Apr 22 17:34:42.556668 ip-10-0-143-10 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:34:42.927591 ip-10-0-143-10 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:42.927591 ip-10-0-143-10 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:34:42.927591 ip-10-0-143-10 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:42.927591 ip-10-0-143-10 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:34:42.927591 ip-10-0-143-10 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:42.929161 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.929024 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:34:42.932458 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932445 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:42.932458 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932458 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932461 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932465 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932467 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932470 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932472 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932475 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932478 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932481 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932483 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932486 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932492 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932494 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932497 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932499 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932507 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932510 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932513 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932516 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932518 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:42.932525 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932521 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932523 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932526 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932529 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932531 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932534 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932537 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932540 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932542 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932545 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932549 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932553 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932556 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932558 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932561 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932563 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932565 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932568 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932571 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:42.933072 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932573 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932575 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932578 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932580 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932583 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932585 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932588 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932590 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932592 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932595 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932597 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932600 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932602 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932605 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932607 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932610 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932613 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932615 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932618 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932629 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:42.933581 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932633 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932637 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932640 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932643 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932646 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932649 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932652 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932654 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932657 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932660 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932662 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932664 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932667 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932671 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932673 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932675 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932678 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932681 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932683 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932686 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:42.934098 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932688 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932691 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932694 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932696 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932699 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.932701 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933074 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933081 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933084 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933086 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933089 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933092 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933095 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933097 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933100 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933102 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933104 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933107 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933115 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933118 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:42.934601 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933121 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933125 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933129 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933132 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933135 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933138 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933140 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933143 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933146 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933149 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933152 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933154 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933157 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933160 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933163 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933166 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933168 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933171 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933174 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:42.935078 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933177 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933180 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933182 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933185 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933187 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933190 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933193 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933195 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933198 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933200 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933203 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933205 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933213 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933215 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933218 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933220 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933223 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933225 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933227 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933230 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:42.935550 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933232 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933235 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933237 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933240 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933242 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933245 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933247 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933249 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933252 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933254 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933258 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933262 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933264 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933268 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933271 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933274 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933276 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933279 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933281 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933284 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:42.936038 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933286 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933288 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933291 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933293 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933295 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933298 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933301 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933303 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933306 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933308 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933310 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933312 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.933315 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934419 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934432 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934440 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934445 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934450 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934453 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934457 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934461 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:34:42.936532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934464 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934467 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934471 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934475 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934478 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934480 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934483 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934486 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934489 2570 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934492 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934494 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934499 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934502 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934505 2570 flags.go:64] FLAG: --config-dir=""
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934507 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934511 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934514 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934523 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934526 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934529 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934532 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934535 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934538 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934541 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934544 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:34:42.937038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934548 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934551 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934553 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934556 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934559 2570 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934562 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934569 2570 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934572 2570 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934575 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934578 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934581 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934585 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934588 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934590 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934593 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934596 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934599 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934602 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934605 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934607 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934610 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934613 2570 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934617 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934620 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934623 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 17:34:42.937655 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934631 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934634 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934637 2570 flags.go:64] FLAG: --help="false"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934640 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-143-10.ec2.internal"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934643 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934646 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934649 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934652 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934656 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934658 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934661 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934664 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934667 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934670 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934673 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934676 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934679 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934682 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934685 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934688 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934691 2570 flags.go:64] FLAG: --lock-file=""
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934694 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934696 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934699 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 17:34:42.938269 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934704 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934707 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934710 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934712 2570 flags.go:64] FLAG: --logging-format="text"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934715 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934718 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934721 2570 flags.go:64] FLAG: --manifest-url=""
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934724 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934728 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934739 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934743 2570 flags.go:64] FLAG: --max-pods="110"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934746 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934749 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934752 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934754 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934757 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934760 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934763 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934770 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934773 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934776 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934779 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934782 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 17:34:42.938911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934787 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934790 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934793 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934796 2570 flags.go:64] FLAG: --port="10250"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934799 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934802 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e06a8369cffe368d"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934805 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934808 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934810 2570 flags.go:64] FLAG: --register-node="true"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934813 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934816 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934819 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934822 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934825 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934827 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934831 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934833 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934836 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934839 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934849 2570 flags.go:64] FLAG: --runonce="false"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934852 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934855 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934858 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934861 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934864 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934867 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 17:34:42.939482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934869 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934872 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934875 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934878 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934881 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934884 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934887 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934889 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934892 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934897 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934900 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934903 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934907 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934909 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934912 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934915 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934918 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934921 2570 flags.go:64] FLAG: --v="2"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934926 2570 flags.go:64] FLAG: --version="false"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934932 2570 flags.go:64] FLAG: --vmodule=""
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934936 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.934939 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935048 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935053 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:42.940113 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935056 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935059 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935062 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935065 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935068 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935071 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935074 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935076 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935079 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935081 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935084 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935087 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935089 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935092 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935094 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935097 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935099 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935102 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935104 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935107 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:42.940726 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935109 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935112 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935114 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935119 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935122 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935124 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935127 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935130 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935132 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935134 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935137 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935139 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935142 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935144 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935146 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935149 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935152 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935154 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935157 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935159 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:42.941225 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935161 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935164 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935167 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935170 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935173 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935175 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935178 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935180 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935182 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935185 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935187 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935190 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935192 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935194 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935197 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935200 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935203 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935205 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935208 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935210 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:42.941723 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935212 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935215 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935217 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935220 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935222 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935224 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935227 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935229 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935232 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935235 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935237 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935239 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935243 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935246 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935249 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935253 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935256 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935258 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935261 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:42.942239 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935265 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935268 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935271 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935273 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.935276 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.935283 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.941889 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.941904 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941952 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941957 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941961 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941964 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941966 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941969 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941971 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941974 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:42.942713 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941976 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941979 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941982 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941984 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941986 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941989 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941991 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941994 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941996 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.941999 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942001 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942004 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942006 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942009 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942011 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942015 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942020 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942023 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942025 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:42.943123 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942028 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942031 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942033 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942036 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942040 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942042 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942045 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942047 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942050 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942052 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942054 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942057 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942060 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942062 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942065 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942067 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942070 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942072 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942074 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:42.943604 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942077 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942079 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942082 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942084 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942087 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942089 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942091 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942094 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942096 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942099 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942101 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942104 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942106 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942109 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942112 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942114 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942117 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942119 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942123 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942125 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:42.944086 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942128 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942130 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942133 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942135 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942137 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942140 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942143 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942145 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942148 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942150 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942154 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942158 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942160 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942163 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942166 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942168 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942171 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942173 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942175 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:42.944576 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942178 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.942183 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942288 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942293 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942297 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942300 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942303 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942305 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942308 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942311 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942313 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942316 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942319 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942322 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942324 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:42.945024 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942327 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942329 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942332 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942334 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942337 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942339 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942341 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942344 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942346 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942349 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942351 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942354 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942356 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942358 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942361 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942364 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942368 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942372 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942375 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942379 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:42.945474 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942382 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942385 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942388 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942390 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942393 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942417 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942422 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942425 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942428 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942431 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942435 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942437 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942440 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942442 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942445 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942447 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942450 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942452 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942455 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942457 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:42.945974 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942460 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942462 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942464 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942467 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942469 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942472 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942474 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942477 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942479 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942482 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942484 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942487 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942490 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942492 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942494 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942497 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942499 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942501 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942504 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942506 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:42.946549 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942508 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942511 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942513 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942516 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942518 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942521 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942523 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942525 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942528 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942530 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942532 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942535 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:42.942538 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.942542 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.943208 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:34:42.947116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.945157 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:34:42.947544 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.945980 2570 server.go:1019] "Starting client certificate rotation"
Apr 22 17:34:42.947544 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.946091 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:42.947544 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.946809 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:42.965972 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.965955 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:42.968166 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.968146 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:42.985291 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.985275 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:34:42.990202 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.990186 2570 log.go:25] "Validated CRI v1 image API"
Apr 22 17:34:42.991394 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.991376 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:34:42.994702 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.994674 2570 fs.go:135] Filesystem UUIDs: map[2909e569-b71e-4e34-a7c9-69b626351c39:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b7add4df-40f8-402c-918e-8548f2d43cb8:/dev/nvme0n1p4]
Apr 22 17:34:42.994702 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.994696 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:34:42.996521 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.996503 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:43.000051 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:42.999942 2570 manager.go:217] Machine: {Timestamp:2026-04-22 17:34:42.998268862 +0000 UTC m=+0.339267924 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100817 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fa60b29fc301ff1cce02e254d0cd4 SystemUUID:ec2fa60b-29fc-301f-f1cc-e02e254d0cd4 BootID:39fd9299-e5bd-40a6-b007-63b41a9993f3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:17:79:15:a9:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:17:79:15:a9:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:d3:44:5a:a0:6f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:34:43.000051 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.000042 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:34:43.000181 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.000113 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:34:43.001049 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.001025 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:34:43.001184 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.001052 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-10.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:34:43.001230 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.001192 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:34:43.001230 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.001201 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:34:43.001230 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.001214 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:43.001925 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.001915 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:43.003190 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.003179 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:43.003290 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.003282 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:34:43.005363 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.005353 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:34:43.005395 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.005372 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:34:43.005395 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.005384 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:34:43.005395 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.005392 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:34:43.005530 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.005411 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:34:43.006383 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.006372 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:43.006427 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.006391 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:43.009029 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.009012 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:34:43.010350 ip-10-0-143-10
kubenswrapper[2570]: I0422 17:34:43.010336 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:34:43.011909 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011898 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:34:43.011944 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011915 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:34:43.011944 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011922 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:34:43.011944 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011928 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:34:43.011944 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011934 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:34:43.011944 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011939 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:34:43.011944 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011945 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:34:43.012114 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011950 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:34:43.012114 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011957 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:34:43.012114 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011963 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:34:43.012114 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011971 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:34:43.012114 
ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.011980 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:34:43.012719 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.012708 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:34:43.012748 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.012720 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:34:43.016108 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.016088 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-10.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:34:43.016258 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.016240 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-10.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:43.016337 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.016322 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:43.017016 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.017003 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:34:43.017050 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.017041 2570 server.go:1295] "Started kubelet" Apr 22 17:34:43.017163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.017113 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 
17:34:43.018228 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.017199 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:34:43.018368 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.018353 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:34:43.019044 ip-10-0-143-10 systemd[1]: Started Kubernetes Kubelet. Apr 22 17:34:43.019840 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.019749 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:34:43.022741 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.022727 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:34:43.024375 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.023518 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-10.ec2.internal.18a8be4b28fae888 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-10.ec2.internal,UID:ip-10-0-143-10.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-10.ec2.internal,},FirstTimestamp:2026-04-22 17:34:43.01701748 +0000 UTC m=+0.358016550,LastTimestamp:2026-04-22 17:34:43.01701748 +0000 UTC m=+0.358016550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-10.ec2.internal,}" Apr 22 17:34:43.027681 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.027662 2570 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:34:43.028252 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028238 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:34:43.028252 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028244 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:34:43.028835 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028818 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:34:43.028835 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028822 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:34:43.028937 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028845 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:34:43.028967 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028944 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:34:43.028967 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.028951 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:34:43.029052 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.028997 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found" Apr 22 17:34:43.029377 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029360 2570 factory.go:55] Registering systemd factory Apr 22 17:34:43.029484 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029381 2570 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:34:43.029613 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029599 2570 factory.go:153] Registering CRI-O factory Apr 22 17:34:43.029613 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029613 2570 factory.go:223] Registration of the crio container factory successfully Apr 22 
17:34:43.029786 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029660 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:34:43.029786 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029682 2570 factory.go:103] Registering Raw factory Apr 22 17:34:43.029786 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.029696 2570 manager.go:1196] Started watching for new ooms in manager Apr 22 17:34:43.030165 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.030152 2570 manager.go:319] Starting recovery of all containers Apr 22 17:34:43.037142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.037123 2570 manager.go:324] Recovery completed Apr 22 17:34:43.042148 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.042000 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:34:43.042203 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.042001 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-10.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 17:34:43.042300 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.042288 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.044426 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044411 2570 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.044497 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044439 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.044497 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044449 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.044964 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044944 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:34:43.044964 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044956 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jbkhn" Apr 22 17:34:43.044964 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044962 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:34:43.045131 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.044988 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:34:43.047589 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.047578 2570 policy_none.go:49] "None policy: Start" Apr 22 17:34:43.047640 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.047593 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:34:43.047640 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.047602 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:34:43.054272 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.054211 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jbkhn" Apr 22 17:34:43.055580 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.055464 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-10.ec2.internal.18a8be4b2a9d214a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-10.ec2.internal,UID:ip-10-0-143-10.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-10.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-10.ec2.internal,},FirstTimestamp:2026-04-22 17:34:43.044426058 +0000 UTC m=+0.385425121,LastTimestamp:2026-04-22 17:34:43.044426058 +0000 UTC m=+0.385425121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-10.ec2.internal,}" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085538 2570 manager.go:341] "Starting Device Plugin manager" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.085562 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085573 2570 server.go:85] "Starting device plugin registration server" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085782 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085792 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085885 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085975 2570 plugin_manager.go:116] "The desired_state_of_world populator 
(plugin watcher) starts" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.085984 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.086362 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:34:43.121855 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.086388 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-10.ec2.internal\" not found" Apr 22 17:34:43.161058 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.161028 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:34:43.162347 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.162333 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:34:43.162443 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.162359 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:34:43.162443 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.162381 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 17:34:43.162443 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.162391 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:34:43.162580 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.162442 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:34:43.165519 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.165505 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:43.186476 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.186440 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.187228 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.187215 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.187275 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.187243 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.187275 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.187253 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.187275 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.187274 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.197516 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.197498 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.197558 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.197522 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-10.ec2.internal\": node \"ip-10-0-143-10.ec2.internal\" not found" Apr 22 17:34:43.214770 
ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.214750 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found" Apr 22 17:34:43.262556 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.262526 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"] Apr 22 17:34:43.262645 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.262592 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.263964 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.263949 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.264054 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.263980 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.264054 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.263994 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.265254 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.265239 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.265472 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.265452 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.265529 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.265493 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.265846 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.265830 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.265933 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.265855 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.265933 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.265865 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.266076 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.266063 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.266116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.266088 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.266116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.266098 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.267108 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.267092 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.267178 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.267127 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.267743 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.267728 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.267809 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.267758 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.267809 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.267771 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.301910 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.301893 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-10.ec2.internal\" not found" node="ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.305046 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.305030 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-10.ec2.internal\" not found" node="ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.315088 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.315072 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found" Apr 22 17:34:43.330303 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.330276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/267d88ca40ef57075d73d52347c0c18c-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"267d88ca40ef57075d73d52347c0c18c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.330372 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.330304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f35ec5c62ed7016564c2db426231f954-config\") pod \"kube-apiserver-proxy-ip-10-0-143-10.ec2.internal\" (UID: \"f35ec5c62ed7016564c2db426231f954\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.330372 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.330322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/267d88ca40ef57075d73d52347c0c18c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"267d88ca40ef57075d73d52347c0c18c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.415513 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.415493 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found" Apr 22 17:34:43.430914 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.430893 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/267d88ca40ef57075d73d52347c0c18c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"267d88ca40ef57075d73d52347c0c18c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" Apr 22 17:34:43.430914 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.430844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/267d88ca40ef57075d73d52347c0c18c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"267d88ca40ef57075d73d52347c0c18c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.431019 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.430945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/267d88ca40ef57075d73d52347c0c18c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"267d88ca40ef57075d73d52347c0c18c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.431019 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.430969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f35ec5c62ed7016564c2db426231f954-config\") pod \"kube-apiserver-proxy-ip-10-0-143-10.ec2.internal\" (UID: \"f35ec5c62ed7016564c2db426231f954\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.431019 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.431002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f35ec5c62ed7016564c2db426231f954-config\") pod \"kube-apiserver-proxy-ip-10-0-143-10.ec2.internal\" (UID: \"f35ec5c62ed7016564c2db426231f954\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.431110 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.431036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/267d88ca40ef57075d73d52347c0c18c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal\" (UID: \"267d88ca40ef57075d73d52347c0c18c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.516208 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.516193 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:43.603668 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.603645 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.608257 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.608243 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:43.616963 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.616944 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:43.717470 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.717437 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:43.817996 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.817973 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:43.918479 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:43.918454 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:43.946010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.945987 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:34:43.946495 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:43.946148 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:44.019326 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:44.019302 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:44.029385 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.029362 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:34:44.045294 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.045246 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:44.057451 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.057394 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:29:43 +0000 UTC" deadline="2028-01-20 08:04:19.615006042 +0000 UTC"
Apr 22 17:34:44.057451 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.057437 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15302h29m35.557572111s"
Apr 22 17:34:44.087180 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.087159 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pcp5x"
Apr 22 17:34:44.098312 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.098260 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pcp5x"
Apr 22 17:34:44.098569 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:44.098533 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35ec5c62ed7016564c2db426231f954.slice/crio-88eb8b88921f1ab5f5caea06564a5d1d64cae5cf9e838822bc37c23f0a1f5340 WatchSource:0}: Error finding container 88eb8b88921f1ab5f5caea06564a5d1d64cae5cf9e838822bc37c23f0a1f5340: Status 404 returned error can't find the container with id 88eb8b88921f1ab5f5caea06564a5d1d64cae5cf9e838822bc37c23f0a1f5340
Apr 22 17:34:44.100270 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:44.100245 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267d88ca40ef57075d73d52347c0c18c.slice/crio-42b2f0abceec972e702bb27459d56b93b48e9e85ef14622e3db350418fdc6efa WatchSource:0}: Error finding container 42b2f0abceec972e702bb27459d56b93b48e9e85ef14622e3db350418fdc6efa: Status 404 returned error can't find the container with id 42b2f0abceec972e702bb27459d56b93b48e9e85ef14622e3db350418fdc6efa
Apr 22 17:34:44.104161 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.104145 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:34:44.119668 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:44.119646 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-10.ec2.internal\" not found"
Apr 22 17:34:44.147192 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.147173 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:44.163721 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.163704 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:44.165028 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.164994 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" event={"ID":"f35ec5c62ed7016564c2db426231f954","Type":"ContainerStarted","Data":"88eb8b88921f1ab5f5caea06564a5d1d64cae5cf9e838822bc37c23f0a1f5340"}
Apr 22 17:34:44.165927 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.165909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" event={"ID":"267d88ca40ef57075d73d52347c0c18c","Type":"ContainerStarted","Data":"42b2f0abceec972e702bb27459d56b93b48e9e85ef14622e3db350418fdc6efa"}
Apr 22 17:34:44.191159 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.191142 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:44.228727 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.228711 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:44.246547 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.246532 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:44.247345 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.247334 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"
Apr 22 17:34:44.263874 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:44.263856 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:45.006934 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.006898 2570 apiserver.go:52] "Watching apiserver"
Apr 22 17:34:45.022256 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.022228 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:34:45.022639 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.022613 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-mltbp","openshift-multus/multus-8vkqt","openshift-multus/multus-additional-cni-plugins-2z7l2","openshift-network-operator/iptables-alerter-z7qf2","openshift-ovn-kubernetes/ovnkube-node-tpwrl","kube-system/konnectivity-agent-tdzw6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl","openshift-dns/node-resolver-8fsg9","openshift-image-registry/node-ca-zff9d","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal","openshift-multus/network-metrics-daemon-srjdz","openshift-network-diagnostics/network-check-target-52lnt","kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal"]
Apr 22 17:34:45.025753 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.025729 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.025850 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.025829 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.027004 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.026981 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.028182 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.028162 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.030111 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.029763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.030686 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.030669 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:34:45.031077 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.031057 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tdzw6"
Apr 22 17:34:45.032345 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.032328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.033081 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.032822 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:34:45.033927 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.033909 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8fsg9"
Apr 22 17:34:45.034427 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.034391 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:34:45.034510 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.034458 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:34:45.034561 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.034417 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:34:45.034851 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.034837 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:34:45.035311 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.035294 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zff9d"
Apr 22 17:34:45.036691 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.036673 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:34:45.036783 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.036758 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:34:45.038222 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038205 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:34:45.038299 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.038255 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:34:45.038741 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-systemd\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.038823 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038753 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-cni-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.038823 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b37faae-6be3-4973-8048-7a21fab3256d-multus-daemon-config\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.038823 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-ovn\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.038823 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-device-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.038994 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038873 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-netns\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.038994 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038923 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-kubelet\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.038994 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038950 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-etc-kubernetes\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.038994 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.038986 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.039191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039012 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjdtw\" (UniqueName: \"kubernetes.io/projected/d771efda-eabb-43a6-b033-a798880231b1-kube-api-access-xjdtw\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.039191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039039 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-run-ovn-kubernetes\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5d2v\" (UniqueName: \"kubernetes.io/projected/5b37faae-6be3-4973-8048-7a21fab3256d-kube-api-access-h5d2v\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-cnibin\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.039191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-system-cni-dir\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.039191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-cni-bin\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovnkube-config\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039237 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovnkube-script-lib\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039266 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hql8\" (UniqueName: \"kubernetes.io/projected/7a77318f-12ee-48f6-8626-11e2875f970b-kube-api-access-6hql8\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-node-log\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039314 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039336 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-registration-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-kubelet\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovn-node-metrics-cert\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039394 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-system-cni-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039436 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039433 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-hostroot\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-multus-certs\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039479 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-systemd-units\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-cni-netd\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-env-overrides\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-var-lib-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-etc-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-cnibin\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b37faae-6be3-4973-8048-7a21fab3256d-cni-binary-copy\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039620 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-socket-dir-parent\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039644 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-cni-binary-copy\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-slash\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-os-release\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039720 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-cni-multus\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.039798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039738 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d771efda-eabb-43a6-b033-a798880231b1-host-slash\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039806 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039841 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d771efda-eabb-43a6-b033-a798880231b1-iptables-alerter-script\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039935 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-log-socket\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-socket-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.039997 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-os-release\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040020 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040036 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-run-netns\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040053 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlmb\" (UniqueName: \"kubernetes.io/projected/acc3c714-ca80-45fe-a1b0-14e012c3d912-kube-api-access-4nlmb\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-cni-bin\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040173 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-conf-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-sys-fs\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-k8s-cni-cncf-io\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040231 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-etc-selinux\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.040281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.040250 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64mxc\" (UniqueName: \"kubernetes.io/projected/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-kube-api-access-64mxc\")
pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" Apr 22 17:34:45.047632 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.047534 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ntf6c\"" Apr 22 17:34:45.047632 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.047621 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9pgcf\"" Apr 22 17:34:45.047894 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.047869 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:34:45.047987 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.047930 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:34:45.048070 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.048051 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:34:45.048120 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.048053 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:34:45.048171 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.048161 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:45.048365 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.048337 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:45.048483 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.048347 2570 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:34:45.050421 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.048796 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:34:45.050421 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.049098 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:34:45.050421 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.049787 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9vmk9\"" Apr 22 17:34:45.050421 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.050030 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:34:45.050421 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.050068 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:34:45.050704 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.050443 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:45.050760 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.050704 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fnvgp\"" Apr 22 17:34:45.050760 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.050721 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:34:45.051101 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.051083 2570 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:45.051278 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.051260 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xphvg\"" Apr 22 17:34:45.051343 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.051315 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:34:45.051634 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.051611 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:34:45.052926 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.051910 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:34:45.052926 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.052314 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-t4d8l\"" Apr 22 17:34:45.052926 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.052614 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:34:45.053276 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.052927 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:34:45.053454 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.053435 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ptgzc\"" Apr 22 17:34:45.053533 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.053466 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8n2vv\"" Apr 22 17:34:45.053533 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.053473 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:34:45.053533 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.053517 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:34:45.053917 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.053899 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-58pbr\"" Apr 22 17:34:45.077633 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.077618 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:45.099434 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.099384 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:44 +0000 UTC" deadline="2027-11-28 01:43:40.077795681 +0000 UTC" Apr 22 17:34:45.099434 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.099430 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14024h8m54.97837033s" Apr 22 17:34:45.130158 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.130139 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:34:45.141189 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovnkube-config\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovnkube-script-lib\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hql8\" (UniqueName: \"kubernetes.io/projected/7a77318f-12ee-48f6-8626-11e2875f970b-kube-api-access-6hql8\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2" Apr 22 17:34:45.141281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141260 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-host\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.141452 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-node-log\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141508 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141474 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: 
\"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" Apr 22 17:34:45.141508 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-registration-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" Apr 22 17:34:45.141566 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-node-log\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141566 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-modprobe-d\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.141636 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-var-lib-kubelet\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.141636 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141608 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-tuned\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.141714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141638 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83519001-bdda-4c9d-ab90-db32b4638392-hosts-file\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.141714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-registration-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" Apr 22 17:34:45.141714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb6sm\" (UniqueName: \"kubernetes.io/projected/0145db4f-d1c7-42f4-8607-b305371c3756-kube-api-access-jb6sm\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:45.141714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-kubelet\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141612 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" Apr 22 17:34:45.141854 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovn-node-metrics-cert\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141854 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-system-cni-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.141854 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141779 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-hostroot\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.141854 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-multus-certs\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.141974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141872 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/96482cfc-a3ad-4187-a8c4-419fcd27a81f-agent-certs\") pod \"konnectivity-agent-tdzw6\" (UID: \"96482cfc-a3ad-4187-a8c4-419fcd27a81f\") " pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:34:45.141974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-systemd-units\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovnkube-config\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovnkube-script-lib\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.141974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141920 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-cni-netd\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141976 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-env-overrides\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141989 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-multus-certs\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.141997 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-hostroot\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142009 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-kubernetes\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142033 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-systemd-units\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-kubelet\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142055 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-system-cni-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-systemd\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-run\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.142125 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142118 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-sys\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142151 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-cni-netd\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/96482cfc-a3ad-4187-a8c4-419fcd27a81f-konnectivity-ca\") pod \"konnectivity-agent-tdzw6\" (UID: \"96482cfc-a3ad-4187-a8c4-419fcd27a81f\") " pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-var-lib-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-etc-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-var-lib-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142314 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-etc-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-cnibin\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b37faae-6be3-4973-8048-7a21fab3256d-cni-binary-copy\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142362 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-cnibin\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/acc3c714-ca80-45fe-a1b0-14e012c3d912-env-overrides\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-socket-dir-parent\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-cni-binary-copy\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-socket-dir-parent\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/06f15cdd-966a-45fd-8c56-bb8afca86d95-tmp\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-b24gg\" (UniqueName: \"kubernetes.io/projected/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-kube-api-access-b24gg\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-slash\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.142554 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-os-release\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-cni-multus\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142610 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-host\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142635 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-serviceca\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d771efda-eabb-43a6-b033-a798880231b1-host-slash\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142674 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-os-release\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142716 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83519001-bdda-4c9d-ab90-db32b4638392-tmp-dir\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d771efda-eabb-43a6-b033-a798880231b1-iptables-alerter-script\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142772 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-log-socket\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142792 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-socket-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-os-release\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142836 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-cni-multus\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142852 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysctl-conf\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142892 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-log-socket\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.143371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-run-netns\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142893 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-cni-binary-copy\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142965 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-os-release\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142997 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d771efda-eabb-43a6-b033-a798880231b1-host-slash\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143014 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-socket-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.142837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b37faae-6be3-4973-8048-7a21fab3256d-cni-binary-copy\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143060 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-run-netns\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143084 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-openvswitch\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nlmb\" (UniqueName: \"kubernetes.io/projected/acc3c714-ca80-45fe-a1b0-14e012c3d912-kube-api-access-4nlmb\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-cni-bin\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-conf-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143185 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-sys-fs\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d771efda-eabb-43a6-b033-a798880231b1-iptables-alerter-script\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.144144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143225 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-cni-bin\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-k8s-cni-cncf-io\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-slash\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-conf-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143265 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysconfig\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-k8s-cni-cncf-io\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfzw5\" (UniqueName: \"kubernetes.io/projected/06f15cdd-966a-45fd-8c56-bb8afca86d95-kube-api-access-dfzw5\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-etc-selinux\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64mxc\" (UniqueName: \"kubernetes.io/projected/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-kube-api-access-64mxc\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-sys-fs\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysctl-d\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143376 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-lib-modules\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-etc-selinux\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-systemd\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-cni-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143438 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-systemd\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b37faae-6be3-4973-8048-7a21fab3256d-multus-daemon-config\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.144887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-ovn\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143495 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-device-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-netns\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-kubelet\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-multus-cni-dir\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143528 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-run-ovn\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-etc-kubernetes\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-device-dir\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143574 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2vv\" (UniqueName: \"kubernetes.io/projected/83519001-bdda-4c9d-ab90-db32b4638392-kube-api-access-4d2vv\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-etc-kubernetes\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143592 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjdtw\" (UniqueName: \"kubernetes.io/projected/d771efda-eabb-43a6-b033-a798880231b1-kube-api-access-xjdtw\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-run-ovn-kubernetes\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-var-lib-kubelet\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5d2v\" (UniqueName: \"kubernetes.io/projected/5b37faae-6be3-4973-8048-7a21fab3256d-kube-api-access-h5d2v\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-cnibin\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-system-cni-dir\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.145621 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143681 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-cni-bin\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-cni-bin\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b37faae-6be3-4973-8048-7a21fab3256d-multus-daemon-config\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143917 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acc3c714-ca80-45fe-a1b0-14e012c3d912-host-run-ovn-kubernetes\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143947 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b37faae-6be3-4973-8048-7a21fab3256d-host-run-netns\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-system-cni-dir\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.143998 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a77318f-12ee-48f6-8626-11e2875f970b-cnibin\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.144368 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a77318f-12ee-48f6-8626-11e2875f970b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.146163 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.146005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/acc3c714-ca80-45fe-a1b0-14e012c3d912-ovn-node-metrics-cert\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.151803 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.151772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hql8\" (UniqueName: \"kubernetes.io/projected/7a77318f-12ee-48f6-8626-11e2875f970b-kube-api-access-6hql8\") pod \"multus-additional-cni-plugins-2z7l2\" (UID: \"7a77318f-12ee-48f6-8626-11e2875f970b\") " pod="openshift-multus/multus-additional-cni-plugins-2z7l2"
Apr 22 17:34:45.155257 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.155231 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjdtw\" (UniqueName: \"kubernetes.io/projected/d771efda-eabb-43a6-b033-a798880231b1-kube-api-access-xjdtw\") pod \"iptables-alerter-z7qf2\" (UID: \"d771efda-eabb-43a6-b033-a798880231b1\") " pod="openshift-network-operator/iptables-alerter-z7qf2"
Apr 22 17:34:45.155854 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.155837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nlmb\" (UniqueName: \"kubernetes.io/projected/acc3c714-ca80-45fe-a1b0-14e012c3d912-kube-api-access-4nlmb\") pod \"ovnkube-node-tpwrl\" (UID: \"acc3c714-ca80-45fe-a1b0-14e012c3d912\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:34:45.156774 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.156742 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5d2v\" (UniqueName: \"kubernetes.io/projected/5b37faae-6be3-4973-8048-7a21fab3256d-kube-api-access-h5d2v\") pod \"multus-8vkqt\" (UID: \"5b37faae-6be3-4973-8048-7a21fab3256d\") " pod="openshift-multus/multus-8vkqt"
Apr 22 17:34:45.159280 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.159249 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64mxc\" (UniqueName: \"kubernetes.io/projected/17f3f47e-5b1b-4dbc-92e0-750d9afbad29-kube-api-access-64mxc\") pod \"aws-ebs-csi-driver-node-6qfhl\" (UID: \"17f3f47e-5b1b-4dbc-92e0-750d9afbad29\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl"
Apr 22 17:34:45.244592 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-kubernetes\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-systemd\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-run\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244686 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-systemd\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-kubernetes\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244721 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-sys\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/96482cfc-a3ad-4187-a8c4-419fcd27a81f-konnectivity-ca\") pod \"konnectivity-agent-tdzw6\" (UID: \"96482cfc-a3ad-4187-a8c4-419fcd27a81f\") " pod="kube-system/konnectivity-agent-tdzw6"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-run\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/06f15cdd-966a-45fd-8c56-bb8afca86d95-tmp\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b24gg\" (UniqueName: \"kubernetes.io/projected/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-kube-api-access-b24gg\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-host\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244818 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-serviceca\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244835 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83519001-bdda-4c9d-ab90-db32b4638392-tmp-dir\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysctl-conf\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysconfig\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.244950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfzw5\" (UniqueName: \"kubernetes.io/projected/06f15cdd-966a-45fd-8c56-bb8afca86d95-kube-api-access-dfzw5\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp"
Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244957 2570
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysctl-d\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-lib-modules\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245010 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2vv\" (UniqueName: \"kubernetes.io/projected/83519001-bdda-4c9d-ab90-db32b4638392-kube-api-access-4d2vv\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245037 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysctl-conf\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245041 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:45.245389 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:34:45.245092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-host\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-modprobe-d\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245145 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-var-lib-kubelet\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-tuned\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83519001-bdda-4c9d-ab90-db32b4638392-hosts-file\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245220 
2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jb6sm\" (UniqueName: \"kubernetes.io/projected/0145db4f-d1c7-42f4-8607-b305371c3756-kube-api-access-jb6sm\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245231 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83519001-bdda-4c9d-ab90-db32b4638392-tmp-dir\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/96482cfc-a3ad-4187-a8c4-419fcd27a81f-agent-certs\") pod \"konnectivity-agent-tdzw6\" (UID: \"96482cfc-a3ad-4187-a8c4-419fcd27a81f\") " pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.245252 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245261 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/96482cfc-a3ad-4187-a8c4-419fcd27a81f-konnectivity-ca\") pod \"konnectivity-agent-tdzw6\" (UID: \"96482cfc-a3ad-4187-a8c4-419fcd27a81f\") " pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysconfig\") pod 
\"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-host\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.245389 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245301 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-sys\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.245345 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.74530302 +0000 UTC m=+3.086302097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245366 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-serviceca\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83519001-bdda-4c9d-ab90-db32b4638392-hosts-file\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-modprobe-d\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245376 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-sysctl-d\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.244905 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-host\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245459 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-lib-modules\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.246010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.245582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06f15cdd-966a-45fd-8c56-bb8afca86d95-var-lib-kubelet\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.247882 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.247855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/06f15cdd-966a-45fd-8c56-bb8afca86d95-tmp\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.247997 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.247898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/06f15cdd-966a-45fd-8c56-bb8afca86d95-etc-tuned\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.248598 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.248579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/96482cfc-a3ad-4187-a8c4-419fcd27a81f-agent-certs\") pod \"konnectivity-agent-tdzw6\" (UID: \"96482cfc-a3ad-4187-a8c4-419fcd27a81f\") " pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:34:45.253856 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.253839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24gg\" (UniqueName: \"kubernetes.io/projected/a345adc2-0a7b-481f-ad9d-9acdfefd72d1-kube-api-access-b24gg\") pod \"node-ca-zff9d\" (UID: \"a345adc2-0a7b-481f-ad9d-9acdfefd72d1\") " pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.259644 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.259597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2vv\" (UniqueName: \"kubernetes.io/projected/83519001-bdda-4c9d-ab90-db32b4638392-kube-api-access-4d2vv\") pod \"node-resolver-8fsg9\" (UID: \"83519001-bdda-4c9d-ab90-db32b4638392\") " pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.259943 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.259919 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:45.260030 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.259944 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:45.260030 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.259960 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:45.260115 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.260037 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.760018446 +0000 UTC m=+3.101017510 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:45.260895 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.260876 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfzw5\" (UniqueName: \"kubernetes.io/projected/06f15cdd-966a-45fd-8c56-bb8afca86d95-kube-api-access-dfzw5\") pod \"tuned-mltbp\" (UID: \"06f15cdd-966a-45fd-8c56-bb8afca86d95\") " pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.260971 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.260960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb6sm\" (UniqueName: \"kubernetes.io/projected/0145db4f-d1c7-42f4-8607-b305371c3756-kube-api-access-jb6sm\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:45.337537 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.337509 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" Apr 22 17:34:45.343676 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.343658 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8vkqt" Apr 22 17:34:45.354301 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.354282 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" Apr 22 17:34:45.358812 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.358795 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z7qf2" Apr 22 17:34:45.365419 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.365379 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:34:45.372031 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.372001 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:34:45.379557 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.379540 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mltbp" Apr 22 17:34:45.386063 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.386047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8fsg9" Apr 22 17:34:45.390516 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.390494 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zff9d" Apr 22 17:34:45.748887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.748861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:45.749035 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.749001 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:45.749077 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.749058 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:46.749044202 +0000 UTC m=+4.090043251 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:45.779480 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.779453 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc3c714_ca80_45fe_a1b0_14e012c3d912.slice/crio-154b8fcfdb5531b9b054740cef9613a863251649cdf21ed9ecdba79c55ebe3d2 WatchSource:0}: Error finding container 154b8fcfdb5531b9b054740cef9613a863251649cdf21ed9ecdba79c55ebe3d2: Status 404 returned error can't find the container with id 154b8fcfdb5531b9b054740cef9613a863251649cdf21ed9ecdba79c55ebe3d2 Apr 22 17:34:45.780877 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.780796 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda345adc2_0a7b_481f_ad9d_9acdfefd72d1.slice/crio-f2fe03320ebe0530ac79be27c7330a207bd8828832f3ef6a5d860bdb4a5d287a WatchSource:0}: Error finding container f2fe03320ebe0530ac79be27c7330a207bd8828832f3ef6a5d860bdb4a5d287a: Status 404 returned error can't find the container with id f2fe03320ebe0530ac79be27c7330a207bd8828832f3ef6a5d860bdb4a5d287a Apr 22 17:34:45.781918 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.781893 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd771efda_eabb_43a6_b033_a798880231b1.slice/crio-3e3665017af0ffcbc775aa313e59e11928de5d3bbf15952ac274de90ab53ec6b WatchSource:0}: Error finding container 3e3665017af0ffcbc775aa313e59e11928de5d3bbf15952ac274de90ab53ec6b: Status 404 returned error can't find the container with id 3e3665017af0ffcbc775aa313e59e11928de5d3bbf15952ac274de90ab53ec6b Apr 22 17:34:45.784199 
ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.784130 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83519001_bdda_4c9d_ab90_db32b4638392.slice/crio-1b65ea04424568a1a51ceb0e86ba38b6322c3f9e2affc80ee203e07e0123f75b WatchSource:0}: Error finding container 1b65ea04424568a1a51ceb0e86ba38b6322c3f9e2affc80ee203e07e0123f75b: Status 404 returned error can't find the container with id 1b65ea04424568a1a51ceb0e86ba38b6322c3f9e2affc80ee203e07e0123f75b Apr 22 17:34:45.785522 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.785454 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96482cfc_a3ad_4187_a8c4_419fcd27a81f.slice/crio-f9b8352339a421158412be57c3c93b3249f62fd9248e83346bebb1ad1fc67384 WatchSource:0}: Error finding container f9b8352339a421158412be57c3c93b3249f62fd9248e83346bebb1ad1fc67384: Status 404 returned error can't find the container with id f9b8352339a421158412be57c3c93b3249f62fd9248e83346bebb1ad1fc67384 Apr 22 17:34:45.787680 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.787647 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a77318f_12ee_48f6_8626_11e2875f970b.slice/crio-d177d9f2ebc328a765f88ee5ff2b652a1aad7dd727563919c95151d9b0a87d6d WatchSource:0}: Error finding container d177d9f2ebc328a765f88ee5ff2b652a1aad7dd727563919c95151d9b0a87d6d: Status 404 returned error can't find the container with id d177d9f2ebc328a765f88ee5ff2b652a1aad7dd727563919c95151d9b0a87d6d Apr 22 17:34:45.788655 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.788631 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06f15cdd_966a_45fd_8c56_bb8afca86d95.slice/crio-9af017956750fffd178c3935a20c6f44c433f66ac93aea1fdd4ab8987ef30f89 WatchSource:0}: Error 
finding container 9af017956750fffd178c3935a20c6f44c433f66ac93aea1fdd4ab8987ef30f89: Status 404 returned error can't find the container with id 9af017956750fffd178c3935a20c6f44c433f66ac93aea1fdd4ab8987ef30f89 Apr 22 17:34:45.789467 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.789432 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b37faae_6be3_4973_8048_7a21fab3256d.slice/crio-73d1e317b2632767bf170883337f4241f06e5973f94350a9a7c681268966bbe7 WatchSource:0}: Error finding container 73d1e317b2632767bf170883337f4241f06e5973f94350a9a7c681268966bbe7: Status 404 returned error can't find the container with id 73d1e317b2632767bf170883337f4241f06e5973f94350a9a7c681268966bbe7 Apr 22 17:34:45.790977 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:34:45.790955 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f3f47e_5b1b_4dbc_92e0_750d9afbad29.slice/crio-67bc793287e3eaed05c8d695ffe3456df89a8bcd811051ace3acaec6f83ef2b1 WatchSource:0}: Error finding container 67bc793287e3eaed05c8d695ffe3456df89a8bcd811051ace3acaec6f83ef2b1: Status 404 returned error can't find the container with id 67bc793287e3eaed05c8d695ffe3456df89a8bcd811051ace3acaec6f83ef2b1 Apr 22 17:34:45.849673 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:45.849547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:45.849786 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.849688 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 22 17:34:45.849786 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.849707 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:45.849786 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.849715 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:45.849786 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:45.849770 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. No retries permitted until 2026-04-22 17:34:46.849752267 +0000 UTC m=+4.190751333 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:46.099928 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.099827 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:44 +0000 UTC" deadline="2027-11-10 22:58:45.957706156 +0000 UTC" Apr 22 17:34:46.099928 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.099858 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13613h23m59.857851739s" Apr 22 17:34:46.171474 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.171433 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tdzw6" event={"ID":"96482cfc-a3ad-4187-a8c4-419fcd27a81f","Type":"ContainerStarted","Data":"f9b8352339a421158412be57c3c93b3249f62fd9248e83346bebb1ad1fc67384"} Apr 22 17:34:46.175164 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.175135 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z7qf2" event={"ID":"d771efda-eabb-43a6-b033-a798880231b1","Type":"ContainerStarted","Data":"3e3665017af0ffcbc775aa313e59e11928de5d3bbf15952ac274de90ab53ec6b"} Apr 22 17:34:46.177999 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.177962 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"154b8fcfdb5531b9b054740cef9613a863251649cdf21ed9ecdba79c55ebe3d2"} Apr 22 17:34:46.181089 ip-10-0-143-10 kubenswrapper[2570]: I0422 
17:34:46.180318 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" event={"ID":"f35ec5c62ed7016564c2db426231f954","Type":"ContainerStarted","Data":"82aafade916c62abf35f64c0ed05637259038a209aeef326e3a35b0000ffef84"} Apr 22 17:34:46.186166 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.186130 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8vkqt" event={"ID":"5b37faae-6be3-4973-8048-7a21fab3256d","Type":"ContainerStarted","Data":"73d1e317b2632767bf170883337f4241f06e5973f94350a9a7c681268966bbe7"} Apr 22 17:34:46.193062 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.193039 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mltbp" event={"ID":"06f15cdd-966a-45fd-8c56-bb8afca86d95","Type":"ContainerStarted","Data":"9af017956750fffd178c3935a20c6f44c433f66ac93aea1fdd4ab8987ef30f89"} Apr 22 17:34:46.196617 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.195692 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-10.ec2.internal" podStartSLOduration=2.195678728 podStartE2EDuration="2.195678728s" podCreationTimestamp="2026-04-22 17:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:46.195374572 +0000 UTC m=+3.536373654" watchObservedRunningTime="2026-04-22 17:34:46.195678728 +0000 UTC m=+3.536677800" Apr 22 17:34:46.196891 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.196868 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerStarted","Data":"d177d9f2ebc328a765f88ee5ff2b652a1aad7dd727563919c95151d9b0a87d6d"} Apr 22 17:34:46.201184 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.201158 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8fsg9" event={"ID":"83519001-bdda-4c9d-ab90-db32b4638392","Type":"ContainerStarted","Data":"1b65ea04424568a1a51ceb0e86ba38b6322c3f9e2affc80ee203e07e0123f75b"} Apr 22 17:34:46.203200 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.203155 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zff9d" event={"ID":"a345adc2-0a7b-481f-ad9d-9acdfefd72d1","Type":"ContainerStarted","Data":"f2fe03320ebe0530ac79be27c7330a207bd8828832f3ef6a5d860bdb4a5d287a"} Apr 22 17:34:46.205310 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.205287 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" event={"ID":"17f3f47e-5b1b-4dbc-92e0-750d9afbad29","Type":"ContainerStarted","Data":"67bc793287e3eaed05c8d695ffe3456df89a8bcd811051ace3acaec6f83ef2b1"} Apr 22 17:34:46.758019 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.757987 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:46.758146 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:46.758125 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:46.758201 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:46.758185 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:48.758166194 +0000 UTC m=+6.099165258 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:46.859352 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:46.859312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:46.859531 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:46.859513 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:46.859601 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:46.859538 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:46.859601 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:46.859551 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:46.859711 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:46.859613 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:48.859593905 +0000 UTC m=+6.200592973 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:47.166748 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:47.166006 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:47.166748 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:47.166130 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:47.166748 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:47.166581 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:47.166748 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:47.166683 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:47.228643 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:47.228609 2570 generic.go:358] "Generic (PLEG): container finished" podID="267d88ca40ef57075d73d52347c0c18c" containerID="9a4f5ce64a453b8c0fc9056613804e05eddc95262113a4b13f224ff7ae4219d0" exitCode=0 Apr 22 17:34:47.229325 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:47.229114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" event={"ID":"267d88ca40ef57075d73d52347c0c18c","Type":"ContainerDied","Data":"9a4f5ce64a453b8c0fc9056613804e05eddc95262113a4b13f224ff7ae4219d0"} Apr 22 17:34:48.250149 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:48.249462 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" event={"ID":"267d88ca40ef57075d73d52347c0c18c","Type":"ContainerStarted","Data":"ae179de72a8eeb441fd18f4139f8095b9dababd22b4579dfeb8dde25776e606f"} Apr 22 17:34:48.778088 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:48.778050 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:48.778246 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:48.778196 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:48.778295 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:48.778263 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs 
podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:52.778247098 +0000 UTC m=+10.119246146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:48.879792 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:48.879208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:48.879792 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:48.879380 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:48.879792 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:48.879413 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:48.879792 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:48.879427 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:48.879792 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:48.879486 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. No retries permitted until 2026-04-22 17:34:52.879469172 +0000 UTC m=+10.220468222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:49.163528 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:49.162892 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:49.163528 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:49.163018 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:49.163528 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:49.162899 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:49.163528 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:49.163438 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:51.163594 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:51.163555 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:51.163991 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:51.163699 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:51.164181 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:51.164162 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:51.164276 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:51.164257 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:52.114220 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.114165 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-10.ec2.internal" podStartSLOduration=8.114149505 podStartE2EDuration="8.114149505s" podCreationTimestamp="2026-04-22 17:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:48.263789515 +0000 UTC m=+5.604788589" watchObservedRunningTime="2026-04-22 17:34:52.114149505 +0000 UTC m=+9.455148577" Apr 22 17:34:52.114570 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.114547 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vl6r9"] Apr 22 17:34:52.118571 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.118550 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.118668 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.118631 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:34:52.207603 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.207554 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dd494ca2-41f2-43cc-bffc-b65701eba67a-kubelet-config\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.208054 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.207671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dd494ca2-41f2-43cc-bffc-b65701eba67a-dbus\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.208054 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.207706 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.308644 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.308607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dd494ca2-41f2-43cc-bffc-b65701eba67a-dbus\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.308787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.308661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.308787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.308706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dd494ca2-41f2-43cc-bffc-b65701eba67a-kubelet-config\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.308862 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.308795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dd494ca2-41f2-43cc-bffc-b65701eba67a-kubelet-config\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.308903 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.308891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dd494ca2-41f2-43cc-bffc-b65701eba67a-dbus\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.308977 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.308966 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:52.309020 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.309011 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret podName:dd494ca2-41f2-43cc-bffc-b65701eba67a nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:52.80899813 +0000 UTC m=+10.149997179 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret") pod "global-pull-secret-syncer-vl6r9" (UID: "dd494ca2-41f2-43cc-bffc-b65701eba67a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:52.811674 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.811634 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:52.811857 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.811722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:52.811857 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.811844 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:52.811938 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.811904 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.811884891 +0000 UTC m=+18.152883946 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:52.812327 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.812297 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:52.812474 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.812351 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret podName:dd494ca2-41f2-43cc-bffc-b65701eba67a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.812336904 +0000 UTC m=+11.153335960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret") pod "global-pull-secret-syncer-vl6r9" (UID: "dd494ca2-41f2-43cc-bffc-b65701eba67a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:52.912755 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:52.912676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:52.912924 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.912854 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:52.912924 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.912880 2570 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:52.912924 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.912893 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:52.913075 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:52.912955 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.912936755 +0000 UTC m=+18.253935811 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:53.165038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:53.164221 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:53.165038 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:53.164353 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:53.165038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:53.164755 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:53.165038 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:53.164849 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:34:53.165038 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:53.164892 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:53.165038 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:53.164958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:53.818016 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:53.817978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:53.818473 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:53.818115 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:53.818473 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:53.818174 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret podName:dd494ca2-41f2-43cc-bffc-b65701eba67a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:55.818155954 +0000 UTC m=+13.159155010 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret") pod "global-pull-secret-syncer-vl6r9" (UID: "dd494ca2-41f2-43cc-bffc-b65701eba67a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:55.163494 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:55.163465 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:55.163905 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:55.163461 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:55.163905 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:55.163593 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:55.163905 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:55.163472 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:55.163905 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:55.163681 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:55.163905 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:55.163774 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:34:55.834539 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:55.834504 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:55.834720 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:55.834619 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:55.834720 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:55.834677 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret podName:dd494ca2-41f2-43cc-bffc-b65701eba67a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:59.834658866 +0000 UTC m=+17.175657930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret") pod "global-pull-secret-syncer-vl6r9" (UID: "dd494ca2-41f2-43cc-bffc-b65701eba67a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:57.162589 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:57.162558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:57.162589 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:57.162572 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:57.163080 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:57.162558 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:57.163080 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:57.162696 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:57.163080 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:57.162774 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:57.163080 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:57.162842 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:34:59.163538 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:59.162781 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:34:59.163538 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:59.162919 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:34:59.163538 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:59.163315 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:34:59.163538 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:59.163430 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:34:59.164180 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:59.164156 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:59.164271 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:59.164248 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:34:59.860239 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:34:59.860205 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:34:59.860494 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:59.860339 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:59.860494 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:34:59.860416 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret podName:dd494ca2-41f2-43cc-bffc-b65701eba67a nodeName:}" failed. No retries permitted until 2026-04-22 17:35:07.860382036 +0000 UTC m=+25.201381085 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret") pod "global-pull-secret-syncer-vl6r9" (UID: "dd494ca2-41f2-43cc-bffc-b65701eba67a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:00.866311 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:00.866274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:00.866698 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:00.866462 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:35:00.866698 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:00.866527 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:16.866510999 +0000 UTC m=+34.207510048 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:35:00.967217 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:00.967181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:00.967440 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:00.967421 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:35:00.967505 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:00.967444 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:35:00.967505 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:00.967454 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:35:00.967584 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:00.967517 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:16.967498685 +0000 UTC m=+34.308497751 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:35:01.163321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:01.163235 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:01.163491 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:01.163236 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:01.163491 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:01.163375 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:35:01.163590 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:01.163483 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:35:01.163590 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:01.163239 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:01.163666 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:01.163654 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:35:03.163828 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:03.163794 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:03.164362 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:03.163878 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:35:03.164362 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:03.163946 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:03.164362 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:03.164104 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:35:03.164362 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:03.164131 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:03.164362 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:03.164223 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:35:04.278837 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.278678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" event={"ID":"17f3f47e-5b1b-4dbc-92e0-750d9afbad29","Type":"ContainerStarted","Data":"6cbe01bac16c7b3bebb4635d25b9392c53ecdaf6f56bc08adc815e1cacc19d09"} Apr 22 17:35:04.279834 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.279811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tdzw6" event={"ID":"96482cfc-a3ad-4187-a8c4-419fcd27a81f","Type":"ContainerStarted","Data":"11d1ba40d04584bec833723571998bdef9593228eef0e2b4d8393a86421fd72c"} Apr 22 17:35:04.281639 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.281615 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"7b142ef0b6794c4517b213db0f9901bce7d5a75cf2a390084e90b5221584bd28"} Apr 22 17:35:04.281739 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.281643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"bb0024d6601b5072c8ffc3f2fa5f9e2a594d0dbf9d8113ca13687293c9ad11e9"} Apr 22 17:35:04.281739 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.281653 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"972c9bfb9bd48228b499de643e5795cab14fdb2d937a9e634cffef4526ba2a03"} Apr 22 17:35:04.282803 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.282783 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8vkqt" 
event={"ID":"5b37faae-6be3-4973-8048-7a21fab3256d","Type":"ContainerStarted","Data":"058f359c69eea14f047db01d46aade7d2b44ed92804f0c358fa7be7055e8b25e"} Apr 22 17:35:04.283950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.283927 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mltbp" event={"ID":"06f15cdd-966a-45fd-8c56-bb8afca86d95","Type":"ContainerStarted","Data":"44ece8ea9dfda6f58127a7dbb1ba1e7c8a842efb7d9609bd4da1bf575e2d6c7a"} Apr 22 17:35:04.285203 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.285164 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a77318f-12ee-48f6-8626-11e2875f970b" containerID="4d3b38afd204f680dbd8096d3af38b7abef2497e23543d400093d55cbb0d561f" exitCode=0 Apr 22 17:35:04.285289 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.285251 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerDied","Data":"4d3b38afd204f680dbd8096d3af38b7abef2497e23543d400093d55cbb0d561f"} Apr 22 17:35:04.286665 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.286639 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8fsg9" event={"ID":"83519001-bdda-4c9d-ab90-db32b4638392","Type":"ContainerStarted","Data":"84bcca7f69712fdd26f6e5dc9ac55c13a3464f0572a2345caeb28c66d5c9975e"} Apr 22 17:35:04.287785 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.287764 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zff9d" event={"ID":"a345adc2-0a7b-481f-ad9d-9acdfefd72d1","Type":"ContainerStarted","Data":"ae1a4b81b3984c2e6f9dfaf7df3f1202384e97112ca33d4c9a89e80f3cca17bc"} Apr 22 17:35:04.293637 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.293603 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tdzw6" 
podStartSLOduration=8.243285748 podStartE2EDuration="21.293593485s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.787430151 +0000 UTC m=+3.128429210" lastFinishedPulling="2026-04-22 17:34:58.837737898 +0000 UTC m=+16.178736947" observedRunningTime="2026-04-22 17:35:04.293116857 +0000 UTC m=+21.634115925" watchObservedRunningTime="2026-04-22 17:35:04.293593485 +0000 UTC m=+21.634592556" Apr 22 17:35:04.308863 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.308826 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8vkqt" podStartSLOduration=3.514725885 podStartE2EDuration="21.308816626s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.791531907 +0000 UTC m=+3.132530956" lastFinishedPulling="2026-04-22 17:35:03.585622629 +0000 UTC m=+20.926621697" observedRunningTime="2026-04-22 17:35:04.30869387 +0000 UTC m=+21.649692941" watchObservedRunningTime="2026-04-22 17:35:04.308816626 +0000 UTC m=+21.649815698" Apr 22 17:35:04.349262 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.349205 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mltbp" podStartSLOduration=3.878334603 podStartE2EDuration="21.349189859s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.791142029 +0000 UTC m=+3.132141100" lastFinishedPulling="2026-04-22 17:35:03.261997293 +0000 UTC m=+20.602996356" observedRunningTime="2026-04-22 17:35:04.348744243 +0000 UTC m=+21.689743316" watchObservedRunningTime="2026-04-22 17:35:04.349189859 +0000 UTC m=+21.690188930" Apr 22 17:35:04.378288 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.378239 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8fsg9" podStartSLOduration=3.903961727 podStartE2EDuration="21.378227266s" 
podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.786118762 +0000 UTC m=+3.127117819" lastFinishedPulling="2026-04-22 17:35:03.260384295 +0000 UTC m=+20.601383358" observedRunningTime="2026-04-22 17:35:04.377620768 +0000 UTC m=+21.718619839" watchObservedRunningTime="2026-04-22 17:35:04.378227266 +0000 UTC m=+21.719226338" Apr 22 17:35:04.932218 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:04.932198 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:35:05.098190 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.098027 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:35:04.932214601Z","UUID":"a0b66606-82f6-4aaa-b6f1-2c3f21be35ed","Handler":null,"Name":"","Endpoint":""} Apr 22 17:35:05.099750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.099729 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:35:05.099750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.099754 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:35:05.163388 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.163359 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:05.163529 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.163358 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:05.163529 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.163420 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:05.163610 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:05.163494 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:35:05.163653 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:05.163597 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:35:05.163691 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:05.163650 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:35:05.292129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.292092 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" event={"ID":"17f3f47e-5b1b-4dbc-92e0-750d9afbad29","Type":"ContainerStarted","Data":"9e9844cc4354781f2e391744fa3ac0437d7f0e1433bcdfe72311eb588adbde62"} Apr 22 17:35:05.293623 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.293596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z7qf2" event={"ID":"d771efda-eabb-43a6-b033-a798880231b1","Type":"ContainerStarted","Data":"2b73ba6c356daf3fedeb62dd532cf6c46387660d2712ad98ca4852c1e1f555c5"} Apr 22 17:35:05.296480 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.296427 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"12644683c1f782fb508fb40c6473e59678079b9d4d23af6d11dd39dc2e02b73f"} Apr 22 17:35:05.296480 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.296460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"25eb382f2607bccf772335700d3c78093c4f9532af21fda5f16e12e500e0fd2f"} Apr 22 17:35:05.296480 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.296470 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"1f91c503e5ac4d524d02b4476b8ca525a880745806fcdcfee096e4d02364b310"} Apr 22 17:35:05.321921 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.321882 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-z7qf2" podStartSLOduration=4.897163797 podStartE2EDuration="22.321871762s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.784345909 +0000 UTC m=+3.125344974" lastFinishedPulling="2026-04-22 17:35:03.209053878 +0000 UTC m=+20.550052939" observedRunningTime="2026-04-22 17:35:05.321622727 +0000 UTC m=+22.662621799" watchObservedRunningTime="2026-04-22 17:35:05.321871762 +0000 UTC m=+22.662870876" Apr 22 17:35:05.322310 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:05.322275 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zff9d" podStartSLOduration=4.845203877 podStartE2EDuration="22.322266624s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.782947418 +0000 UTC m=+3.123946472" lastFinishedPulling="2026-04-22 17:35:03.260010154 +0000 UTC m=+20.601009219" observedRunningTime="2026-04-22 17:35:04.402820829 +0000 UTC m=+21.743819900" watchObservedRunningTime="2026-04-22 17:35:05.322266624 +0000 UTC m=+22.663265695" Apr 22 17:35:06.300213 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:06.300173 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" event={"ID":"17f3f47e-5b1b-4dbc-92e0-750d9afbad29","Type":"ContainerStarted","Data":"0a633f70706d9aef02577d973f3f11b8bb1b7bd9940f3ab7a1ed010c25ebac34"} Apr 22 17:35:06.329217 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:06.329159 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6qfhl" podStartSLOduration=3.323627138 podStartE2EDuration="23.329141464s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.792522112 +0000 UTC m=+3.133521160" lastFinishedPulling="2026-04-22 17:35:05.798036424 +0000 UTC m=+23.139035486" 
observedRunningTime="2026-04-22 17:35:06.328792085 +0000 UTC m=+23.669791179" watchObservedRunningTime="2026-04-22 17:35:06.329141464 +0000 UTC m=+23.670140538" Apr 22 17:35:07.163224 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:07.162988 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:07.163224 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:07.163038 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:07.163476 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:07.163053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:07.163476 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:07.163279 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed" Apr 22 17:35:07.163476 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:07.163421 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a" Apr 22 17:35:07.163589 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:07.163517 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756" Apr 22 17:35:07.305317 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:07.305279 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"178762551fbac87d6b84236e9eb7efa9fcc1eb2658a40a2878d7f178c8c88767"} Apr 22 17:35:07.918457 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:07.918357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:07.918604 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:07.918512 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:07.918604 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:07.918581 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret podName:dd494ca2-41f2-43cc-bffc-b65701eba67a nodeName:}" failed. No retries permitted until 2026-04-22 17:35:23.918562094 +0000 UTC m=+41.259561143 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret") pod "global-pull-secret-syncer-vl6r9" (UID: "dd494ca2-41f2-43cc-bffc-b65701eba67a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:08.341689 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:08.341652 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:35:08.342367 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:08.342346 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tdzw6" Apr 22 17:35:09.163129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.163102 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:09.163129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.163123 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:09.163270 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.163111 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:09.163270 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:09.163207 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:35:09.163365 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:09.163289 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:35:09.163440 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:09.163381 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a"
Apr 22 17:35:09.310216 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.309920 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a77318f-12ee-48f6-8626-11e2875f970b" containerID="463655b3baf3195315c5bcab820a03262556d9a10da703d16b6aa418834b242d" exitCode=0
Apr 22 17:35:09.310381 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.309987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerDied","Data":"463655b3baf3195315c5bcab820a03262556d9a10da703d16b6aa418834b242d"}
Apr 22 17:35:09.313863 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.313831 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" event={"ID":"acc3c714-ca80-45fe-a1b0-14e012c3d912","Type":"ContainerStarted","Data":"38355828561d8bad054d36761f6adcf9dbad9e09bc2c6895e917f877251d44a1"}
Apr 22 17:35:09.314089 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.314070 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tdzw6"
Apr 22 17:35:09.314722 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.314707 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tdzw6"
Apr 22 17:35:09.392086 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:09.392038 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" podStartSLOduration=8.265714563 podStartE2EDuration="26.39202545s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.78126765 +0000 UTC m=+3.122266698" lastFinishedPulling="2026-04-22 17:35:03.907578522 +0000 UTC m=+21.248577585" observedRunningTime="2026-04-22 17:35:09.375796816 +0000 UTC m=+26.716795887" watchObservedRunningTime="2026-04-22 17:35:09.39202545 +0000 UTC m=+26.733024520"
Apr 22 17:35:10.315676 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:10.315643 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:35:10.315676 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:10.315681 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:35:10.315882 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:10.315690 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:35:10.332482 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:10.332455 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:35:10.332900 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:10.332879 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl"
Apr 22 17:35:11.162582 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.162550 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:35:11.163067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.162550 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:35:11.163067 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:11.162650 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:35:11.163067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.162548 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9"
Apr 22 17:35:11.163067 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:11.162724 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:35:11.163067 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:11.162790 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a"
Apr 22 17:35:11.318970 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.318934 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a77318f-12ee-48f6-8626-11e2875f970b" containerID="5b55a30dd856cab9f277000fb1e83b566f979235b6a6a4767411d5aca162e2a2" exitCode=0
Apr 22 17:35:11.319116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.319023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerDied","Data":"5b55a30dd856cab9f277000fb1e83b566f979235b6a6a4767411d5aca162e2a2"}
Apr 22 17:35:11.352262 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.352238 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-52lnt"]
Apr 22 17:35:11.352366 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.352322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:35:11.352456 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:11.352433 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:35:11.354242 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.354221 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vl6r9"]
Apr 22 17:35:11.354336 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.354283 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9"
Apr 22 17:35:11.354384 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:11.354347 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a"
Apr 22 17:35:11.361020 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.360992 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-srjdz"]
Apr 22 17:35:11.361133 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:11.361119 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:35:11.361232 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:11.361212 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:35:13.164442 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:13.164416 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:35:13.164746 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:13.164446 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9"
Apr 22 17:35:13.164746 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:13.164537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:35:13.164746 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:13.164553 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:35:13.164746 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:13.164637 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:35:13.164939 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:13.164745 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a"
Apr 22 17:35:13.325174 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:13.325138 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a77318f-12ee-48f6-8626-11e2875f970b" containerID="d86ed4007156cee79604397f4cd2e103366439df0cbb5f7fee48fb7513c97c7a" exitCode=0
Apr 22 17:35:13.325174 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:13.325177 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerDied","Data":"d86ed4007156cee79604397f4cd2e103366439df0cbb5f7fee48fb7513c97c7a"}
Apr 22 17:35:15.163185 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:15.162982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:35:15.163560 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:15.163267 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:35:15.163560 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:15.163032 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:35:15.163560 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:15.163341 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:35:15.163560 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:15.163002 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9"
Apr 22 17:35:15.163560 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:15.163389 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a"
Apr 22 17:35:16.888643 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:16.888610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:35:16.889066 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:16.888804 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:35:16.889066 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:16.888895 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:48.888871879 +0000 UTC m=+66.229870929 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:35:16.989566 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:16.989540 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:35:16.989709 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:16.989689 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:35:16.989709 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:16.989712 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:35:16.989857 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:16.989721 2570 projected.go:194] Error preparing data for projected volume kube-api-access-xvfgw for pod openshift-network-diagnostics/network-check-target-52lnt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:35:16.989857 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:16.989771 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw podName:30f6ba3b-ef51-43d7-985c-46db837889ed nodeName:}" failed. No retries permitted until 2026-04-22 17:35:48.989757956 +0000 UTC m=+66.330757005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xvfgw" (UniqueName: "kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw") pod "network-check-target-52lnt" (UID: "30f6ba3b-ef51-43d7-985c-46db837889ed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:35:17.162845 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.162770 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt"
Apr 22 17:35:17.163000 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.162775 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:35:17.163000 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.162893 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52lnt" podUID="30f6ba3b-ef51-43d7-985c-46db837889ed"
Apr 22 17:35:17.163000 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.162774 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9"
Apr 22 17:35:17.163000 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.162988 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:35:17.163184 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.163038 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vl6r9" podUID="dd494ca2-41f2-43cc-bffc-b65701eba67a"
Apr 22 17:35:17.538413 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.538367 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-10.ec2.internal" event="NodeReady"
Apr 22 17:35:17.538588 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.538529 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 17:35:17.584570 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.584532 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"]
Apr 22 17:35:17.613417 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.613369 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85664766d-jq2bw"]
Apr 22 17:35:17.613547 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.613525 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:35:17.622191 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.622165 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-4bjzc\""
Apr 22 17:35:17.622684 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.622647 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 17:35:17.622684 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.622661 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 17:35:17.639170 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.639134 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"]
Apr 22 17:35:17.639170 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.639167 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hknm7"]
Apr 22 17:35:17.639322 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.639305 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.643332 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.643310 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 17:35:17.643683 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.643310 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 17:35:17.643683 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.643653 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 17:35:17.648683 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.648661 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzdz4\""
Apr 22 17:35:17.657116 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.657092 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dvrrw"]
Apr 22 17:35:17.657265 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.657244 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.660196 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.660178 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 17:35:17.662518 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.662497 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 17:35:17.663229 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.663209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m6rcd\""
Apr 22 17:35:17.665912 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.665894 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 17:35:17.681788 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.681769 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85664766d-jq2bw"]
Apr 22 17:35:17.681876 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.681796 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hknm7"]
Apr 22 17:35:17.681876 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.681809 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dvrrw"]
Apr 22 17:35:17.681955 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.681913 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:35:17.687075 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.687054 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 17:35:17.687285 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.687272 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 17:35:17.688363 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.687675 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 17:35:17.688363 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.687710 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dt854\""
Apr 22 17:35:17.694100 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.694081 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:35:17.694190 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.694110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:35:17.795430 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft2dz\" (UniqueName: \"kubernetes.io/projected/d3a68516-ea37-46c1-bb27-cb34ede968ac-kube-api-access-ft2dz\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:35:17.795430 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795379 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd141af-4f14-4224-b057-0cd35252fcd8-config-volume\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.795430 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edd141af-4f14-4224-b057-0cd35252fcd8-tmp-dir\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.795654 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-image-registry-private-configuration\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795654 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:35:17.795654 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:35:17.795654 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4789d962-90d0-4f73-b359-b7df6a792bd5-ca-trust-extracted\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795801 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795676 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795801 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795727 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbkn\" (UniqueName: \"kubernetes.io/projected/edd141af-4f14-4224-b057-0cd35252fcd8-kube-api-access-mdbkn\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.795801 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795752 2570
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-certificates\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795801 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-installation-pull-secrets\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:35:17.795950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795824 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfjj4\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-kube-api-access-jfjj4\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795863 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.795950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-trusted-ca\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.795950 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.795937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-bound-sa-token\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.796761 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.796736 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:35:17.796894 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.796853 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:35:17.796938 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.796908 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:35:18.296889416 +0000 UTC m=+35.637888468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found
Apr 22 17:35:17.896518 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896487 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-trusted-ca\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.896518 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896526 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-bound-sa-token\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft2dz\" (UniqueName: \"kubernetes.io/projected/d3a68516-ea37-46c1-bb27-cb34ede968ac-kube-api-access-ft2dz\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd141af-4f14-4224-b057-0cd35252fcd8-config-volume\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edd141af-4f14-4224-b057-0cd35252fcd8-tmp-dir\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-image-registry-private-configuration\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896711 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4789d962-90d0-4f73-b359-b7df6a792bd5-ca-trust-extracted\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896738 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbkn\" (UniqueName: \"kubernetes.io/projected/edd141af-4f14-4224-b057-0cd35252fcd8-kube-api-access-mdbkn\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-certificates\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-installation-pull-secrets\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896858 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfjj4\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-kube-api-access-jfjj4\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: E0422
17:35:17.896898 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.896916 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.896922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.896958 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:18.396943088 +0000 UTC m=+35.737942138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.897014 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edd141af-4f14-4224-b057-0cd35252fcd8-tmp-dir\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:17.897248 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.897041 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.897095 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:18.397079445 +0000 UTC m=+35.738078496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.897163 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:17.897192 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:18.397181917 +0000 UTC m=+35.738180965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.897227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd141af-4f14-4224-b057-0cd35252fcd8-config-volume\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.897326 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4789d962-90d0-4f73-b359-b7df6a792bd5-ca-trust-extracted\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.897479 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-trusted-ca\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:17.897951 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.897721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-certificates\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " 
pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:17.901542 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.901520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-installation-pull-secrets\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:17.901661 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.901520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-image-registry-private-configuration\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:17.907867 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.907840 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbkn\" (UniqueName: \"kubernetes.io/projected/edd141af-4f14-4224-b057-0cd35252fcd8-kube-api-access-mdbkn\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:17.908209 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.908185 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft2dz\" (UniqueName: \"kubernetes.io/projected/d3a68516-ea37-46c1-bb27-cb34ede968ac-kube-api-access-ft2dz\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:17.909448 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.909430 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfjj4\" (UniqueName: 
\"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-kube-api-access-jfjj4\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:17.909772 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:17.909754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-bound-sa-token\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:18.300330 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:18.300287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:35:18.300508 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.300479 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:35:18.300579 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.300558 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:35:19.300543548 +0000 UTC m=+36.641542597 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:35:18.401667 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:18.401628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:18.401838 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:18.401694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:18.401838 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:18.401737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:18.401838 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401783 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:18.401838 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401801 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 
17:35:18.402018 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401849 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:18.402018 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401854 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:19.401835621 +0000 UTC m=+36.742834690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:18.402018 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401852 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:18.402018 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401917 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:35:19.401901158 +0000 UTC m=+36.742900209 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:18.402018 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:18.401946 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:19.401935065 +0000 UTC m=+36.742934146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:19.163500 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.163466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:19.163500 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.163482 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:19.164211 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.163482 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:19.166943 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.166915 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z4sh7\"" Apr 22 17:35:19.167849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.167535 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:35:19.167849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.167557 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:35:19.167849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.167626 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:35:19.168414 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.168203 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qs9jr\"" Apr 22 17:35:19.168414 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.168284 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:35:19.310023 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.309992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:35:19.310151 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.310135 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:35:19.310224 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.310215 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:35:21.3101972 +0000 UTC m=+38.651196250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:35:19.411128 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.411103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:19.411230 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.411147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:19.411230 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:19.411181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:19.411331 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.411268 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:19.411331 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.411283 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:19.411331 ip-10-0-143-10 kubenswrapper[2570]: E0422 
17:35:19.411290 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:35:19.411472 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.411339 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:21.411321889 +0000 UTC m=+38.752320956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:19.411472 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.411354 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:21.411347551 +0000 UTC m=+38.752346600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:19.411472 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.411386 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:19.411472 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:19.411448 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:21.411430513 +0000 UTC m=+38.752429575 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:20.339199 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:20.339168 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a77318f-12ee-48f6-8626-11e2875f970b" containerID="43e73e76efbe753adc879ef8d527c1210c82322f2fe6d79c0303a83588250410" exitCode=0 Apr 22 17:35:20.339668 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:20.339211 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerDied","Data":"43e73e76efbe753adc879ef8d527c1210c82322f2fe6d79c0303a83588250410"} Apr 22 17:35:21.328177 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:21.328144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:35:21.328352 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.328285 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:35:21.328352 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.328344 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:25.328329454 +0000 UTC m=+42.669328502 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:35:21.342899 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:21.342872 2570 generic.go:358] "Generic (PLEG): container finished" podID="7a77318f-12ee-48f6-8626-11e2875f970b" containerID="95b28c8926673243d2e240ddd7b7d7d213996d50bf9d366562554855afd6546c" exitCode=0 Apr 22 17:35:21.343182 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:21.342913 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerDied","Data":"95b28c8926673243d2e240ddd7b7d7d213996d50bf9d366562554855afd6546c"} Apr 22 17:35:21.428735 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:21.428710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:21.428845 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:21.428807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:21.428845 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.428824 2570 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:21.428845 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.428844 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:35:21.428946 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:21.428854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:21.428946 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.428888 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:25.428873338 +0000 UTC m=+42.769872388 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:21.428946 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.428931 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:21.429048 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.428966 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:25.428955522 +0000 UTC m=+42.769954571 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:21.429048 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.429001 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:21.429048 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:21.429020 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:35:25.429012973 +0000 UTC m=+42.770012022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:22.347930 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:22.347884 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" event={"ID":"7a77318f-12ee-48f6-8626-11e2875f970b","Type":"ContainerStarted","Data":"0e557001a6c4d5a4bea683830ce82c05f3302b5d3066e46c8b8d1fe67a0f83c1"} Apr 22 17:35:22.375250 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:22.375211 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2z7l2" podStartSLOduration=5.811170447 podStartE2EDuration="39.37520049s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.789464613 +0000 UTC m=+3.130463663" lastFinishedPulling="2026-04-22 17:35:19.353494657 +0000 UTC m=+36.694493706" observedRunningTime="2026-04-22 17:35:22.373891678 +0000 UTC 
m=+39.714890749" watchObservedRunningTime="2026-04-22 17:35:22.37520049 +0000 UTC m=+39.716199560" Apr 22 17:35:23.944980 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:23.944940 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:23.948198 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:23.948170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd494ca2-41f2-43cc-bffc-b65701eba67a-original-pull-secret\") pod \"global-pull-secret-syncer-vl6r9\" (UID: \"dd494ca2-41f2-43cc-bffc-b65701eba67a\") " pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:23.981646 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:23.981621 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vl6r9" Apr 22 17:35:24.142282 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:24.142248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vl6r9"] Apr 22 17:35:24.147956 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:35:24.147930 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd494ca2_41f2_43cc_bffc_b65701eba67a.slice/crio-a3013790d327f3f79d34295de9912abe26ec77078cf94b8020833cd7821ef0b3 WatchSource:0}: Error finding container a3013790d327f3f79d34295de9912abe26ec77078cf94b8020833cd7821ef0b3: Status 404 returned error can't find the container with id a3013790d327f3f79d34295de9912abe26ec77078cf94b8020833cd7821ef0b3 Apr 22 17:35:24.351562 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:24.351526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vl6r9" event={"ID":"dd494ca2-41f2-43cc-bffc-b65701eba67a","Type":"ContainerStarted","Data":"a3013790d327f3f79d34295de9912abe26ec77078cf94b8020833cd7821ef0b3"} Apr 22 17:35:25.357964 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:25.357927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:35:25.358354 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.358114 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:35:25.358354 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.358205 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:35:33.358182582 +0000 UTC m=+50.699181632 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:35:25.458774 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:25.458742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:25.458774 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:25.458779 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:25.458974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:25.458862 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:25.458974 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.458890 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 
22 17:35:25.458974 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.458947 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:25.458974 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.458954 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:25.458974 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.458973 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:35:25.459128 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.458951 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:35:33.45893022 +0000 UTC m=+50.799929287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:25.459128 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.459012 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:33.458997525 +0000 UTC m=+50.799996577 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:25.459128 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:25.459022 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:33.459016094 +0000 UTC m=+50.800015142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:29.362798 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:29.362763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vl6r9" event={"ID":"dd494ca2-41f2-43cc-bffc-b65701eba67a","Type":"ContainerStarted","Data":"70b80485d259e1ee63ec7c3ddcf3d6c020b2b6157facb895a4315789a745fc6a"} Apr 22 17:35:29.388852 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:29.388806 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vl6r9" podStartSLOduration=33.014040438 podStartE2EDuration="37.388792027s" podCreationTimestamp="2026-04-22 17:34:52 +0000 UTC" firstStartedPulling="2026-04-22 17:35:24.149718914 +0000 UTC m=+41.490717967" lastFinishedPulling="2026-04-22 17:35:28.524470505 +0000 UTC m=+45.865469556" observedRunningTime="2026-04-22 17:35:29.388103803 +0000 UTC m=+46.729102875" watchObservedRunningTime="2026-04-22 17:35:29.388792027 +0000 UTC m=+46.729791098" Apr 22 17:35:33.418671 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:33.418626 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:35:33.419219 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.418764 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:35:33.419219 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.418837 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:35:49.418819673 +0000 UTC m=+66.759818721 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:35:33.519903 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:33.519875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:33.520044 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:33.519927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:33.520044 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:33.519947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:33.520044 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.520025 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:33.520044 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.520025 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:33.520044 ip-10-0-143-10 kubenswrapper[2570]: E0422 
17:35:33.520031 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:33.520233 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.520057 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:35:33.520233 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.520088 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:35:49.520070245 +0000 UTC m=+66.861069294 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:33.520233 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.520107 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:49.520095451 +0000 UTC m=+66.861094499 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:33.520233 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:33.520121 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:49.520114689 +0000 UTC m=+66.861113739 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:41.082889 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.082861 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2"] Apr 22 17:35:41.122745 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.122714 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2"] Apr 22 17:35:41.122896 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.122766 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.125508 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.125475 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 17:35:41.125645 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.125523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 17:35:41.126676 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.126653 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 17:35:41.126754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.126717 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" 
Apr 22 17:35:41.174869 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.174848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/89c77394-bfa7-431a-bff0-f69f1cf7c185-tmp\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.175002 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.174881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/89c77394-bfa7-431a-bff0-f69f1cf7c185-klusterlet-config\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.175002 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.174991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2xs\" (UniqueName: \"kubernetes.io/projected/89c77394-bfa7-431a-bff0-f69f1cf7c185-kube-api-access-zb2xs\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.276176 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.276150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2xs\" (UniqueName: \"kubernetes.io/projected/89c77394-bfa7-431a-bff0-f69f1cf7c185-kube-api-access-zb2xs\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.276255 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.276225 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/89c77394-bfa7-431a-bff0-f69f1cf7c185-tmp\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.276305 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.276250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/89c77394-bfa7-431a-bff0-f69f1cf7c185-klusterlet-config\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.276649 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.276627 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/89c77394-bfa7-431a-bff0-f69f1cf7c185-tmp\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.278825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.278797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/89c77394-bfa7-431a-bff0-f69f1cf7c185-klusterlet-config\") pod \"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.284624 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.284600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2xs\" (UniqueName: \"kubernetes.io/projected/89c77394-bfa7-431a-bff0-f69f1cf7c185-kube-api-access-zb2xs\") pod 
\"klusterlet-addon-workmgr-74686c766b-8x8c2\" (UID: \"89c77394-bfa7-431a-bff0-f69f1cf7c185\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.432379 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.432317 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:41.545340 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:41.545311 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2"] Apr 22 17:35:41.548116 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:35:41.548086 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c77394_bfa7_431a_bff0_f69f1cf7c185.slice/crio-362952d945e26d380c846d452ed1ab2ba7e74ede47172abd375625b1e6102ec3 WatchSource:0}: Error finding container 362952d945e26d380c846d452ed1ab2ba7e74ede47172abd375625b1e6102ec3: Status 404 returned error can't find the container with id 362952d945e26d380c846d452ed1ab2ba7e74ede47172abd375625b1e6102ec3 Apr 22 17:35:42.335332 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:42.335305 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpwrl" Apr 22 17:35:42.390205 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:42.390170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" event={"ID":"89c77394-bfa7-431a-bff0-f69f1cf7c185","Type":"ContainerStarted","Data":"362952d945e26d380c846d452ed1ab2ba7e74ede47172abd375625b1e6102ec3"} Apr 22 17:35:47.400848 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:47.400806 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" 
event={"ID":"89c77394-bfa7-431a-bff0-f69f1cf7c185","Type":"ContainerStarted","Data":"d63a2892229f338b1767b8e7743f352eae81dafcfb7ac1e845b35215824ed18f"} Apr 22 17:35:47.401264 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:47.401025 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:47.402692 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:47.402672 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:35:47.426464 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:47.426419 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" podStartSLOduration=1.0909001680000001 podStartE2EDuration="6.426384655s" podCreationTimestamp="2026-04-22 17:35:41 +0000 UTC" firstStartedPulling="2026-04-22 17:35:41.550001722 +0000 UTC m=+58.891000771" lastFinishedPulling="2026-04-22 17:35:46.885486206 +0000 UTC m=+64.226485258" observedRunningTime="2026-04-22 17:35:47.424879315 +0000 UTC m=+64.765878386" watchObservedRunningTime="2026-04-22 17:35:47.426384655 +0000 UTC m=+64.767383725" Apr 22 17:35:48.936792 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:48.936759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:35:48.939786 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:48.939758 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:35:48.947162 ip-10-0-143-10 kubenswrapper[2570]: E0422 
17:35:48.947146 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:35:48.947223 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:48.947202 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:52.947186707 +0000 UTC m=+130.288185757 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : secret "metrics-daemon-secret" not found Apr 22 17:35:49.037430 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.037387 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" (UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:49.040519 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.040502 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:35:49.050255 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.050239 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:35:49.060897 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.060878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfgw\" (UniqueName: \"kubernetes.io/projected/30f6ba3b-ef51-43d7-985c-46db837889ed-kube-api-access-xvfgw\") pod \"network-check-target-52lnt\" 
(UID: \"30f6ba3b-ef51-43d7-985c-46db837889ed\") " pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:49.179849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.179829 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z4sh7\"" Apr 22 17:35:49.187257 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.187214 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:49.306764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.306733 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-52lnt"] Apr 22 17:35:49.309414 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:35:49.309366 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f6ba3b_ef51_43d7_985c_46db837889ed.slice/crio-cf6aac799c3520dec477ee19da8bd05ff977375f7c1a596b15adef0f1ad46f25 WatchSource:0}: Error finding container cf6aac799c3520dec477ee19da8bd05ff977375f7c1a596b15adef0f1ad46f25: Status 404 returned error can't find the container with id cf6aac799c3520dec477ee19da8bd05ff977375f7c1a596b15adef0f1ad46f25 Apr 22 17:35:49.405565 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.405534 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-52lnt" event={"ID":"30f6ba3b-ef51-43d7-985c-46db837889ed","Type":"ContainerStarted","Data":"cf6aac799c3520dec477ee19da8bd05ff977375f7c1a596b15adef0f1ad46f25"} Apr 22 17:35:49.440763 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.440704 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:35:49.440850 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.440832 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:35:49.440916 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.440905 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:36:21.440883622 +0000 UTC m=+98.781882671 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:35:49.541223 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.541193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:35:49.541368 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:49.541244 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:35:49.541368 ip-10-0-143-10 kubenswrapper[2570]: 
I0422 17:35:49.541275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:35:49.541368 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541348 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:35:49.541553 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541373 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:35:49.541553 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541381 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:49.541553 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541395 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:49.541553 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541473 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:21.541452847 +0000 UTC m=+98.882451907 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:35:49.541553 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541487 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:21.541481575 +0000 UTC m=+98.882480625 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:35:49.541553 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:35:49.541498 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:36:21.541492732 +0000 UTC m=+98.882491781 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:35:52.411946 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:52.411867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-52lnt" event={"ID":"30f6ba3b-ef51-43d7-985c-46db837889ed","Type":"ContainerStarted","Data":"82a23dec636ad1c0849972e95b8ef95037f755bee649b383d30a59f002222756"} Apr 22 17:35:52.412306 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:52.412031 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:35:52.430353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:35:52.430306 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-52lnt" podStartSLOduration=66.645813666 podStartE2EDuration="1m9.430295304s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:35:49.311357699 +0000 UTC m=+66.652356753" lastFinishedPulling="2026-04-22 17:35:52.095839338 +0000 UTC m=+69.436838391" observedRunningTime="2026-04-22 17:35:52.429960191 +0000 UTC m=+69.770959264" watchObservedRunningTime="2026-04-22 17:35:52.430295304 +0000 UTC m=+69.771294412" Apr 22 17:36:21.481743 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:36:21.481702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:36:21.482168 ip-10-0-143-10 
kubenswrapper[2570]: E0422 17:36:21.481843 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:36:21.482168 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.481916 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:37:25.481900618 +0000 UTC m=+162.822899667 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found Apr 22 17:36:21.582028 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:36:21.581980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:36:21.582028 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:36:21.582038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.582121 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: 
E0422 17:36:21.582145 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:36:21.582156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7" Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.582168 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85664766d-jq2bw: secret "image-registry-tls" not found Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.582172 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:37:25.582159237 +0000 UTC m=+162.923158286 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.582251 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:36:21.582253 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.582254 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls podName:4789d962-90d0-4f73-b359-b7df6a792bd5 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:37:25.582229089 +0000 UTC m=+162.923228140 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls") pod "image-registry-85664766d-jq2bw" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5") : secret "image-registry-tls" not found Apr 22 17:36:21.582511 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:21.582276 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls podName:edd141af-4f14-4224-b057-0cd35252fcd8 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:25.582267266 +0000 UTC m=+162.923266315 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls") pod "dns-default-hknm7" (UID: "edd141af-4f14-4224-b057-0cd35252fcd8") : secret "dns-default-metrics-tls" not found Apr 22 17:36:23.416424 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:36:23.416379 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-52lnt" Apr 22 17:36:53.004137 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:36:53.004078 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:36:53.004654 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:53.004225 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:36:53.004654 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:36:53.004291 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs podName:0145db4f-d1c7-42f4-8607-b305371c3756 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:55.004272847 +0000 UTC m=+252.345271896 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs") pod "network-metrics-daemon-srjdz" (UID: "0145db4f-d1c7-42f4-8607-b305371c3756") : secret "metrics-daemon-secret" not found Apr 22 17:37:11.382608 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.382580 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"] Apr 22 17:37:11.385473 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.385455 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.388003 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.387982 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 17:37:11.388972 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.388944 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.389067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.388967 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.389067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.388967 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8pzjm\"" Apr 22 17:37:11.390489 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.390467 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 17:37:11.398748 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.398727 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"] Apr 22 17:37:11.432252 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.432225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ll8n\" (UniqueName: \"kubernetes.io/projected/bce37b62-bce4-4fb4-8842-12a172ca9af4-kube-api-access-5ll8n\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.432395 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.432282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bce37b62-bce4-4fb4-8842-12a172ca9af4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.432395 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.432372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.533325 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.533296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/bce37b62-bce4-4fb4-8842-12a172ca9af4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.533493 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.533349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.533493 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.533387 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ll8n\" (UniqueName: \"kubernetes.io/projected/bce37b62-bce4-4fb4-8842-12a172ca9af4-kube-api-access-5ll8n\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.533607 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:11.533503 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:11.533607 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:11.533578 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls podName:bce37b62-bce4-4fb4-8842-12a172ca9af4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:12.033558068 +0000 UTC m=+149.374557122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p4547" (UID: "bce37b62-bce4-4fb4-8842-12a172ca9af4") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:11.534029 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.534004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bce37b62-bce4-4fb4-8842-12a172ca9af4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:11.542004 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:11.541985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ll8n\" (UniqueName: \"kubernetes.io/projected/bce37b62-bce4-4fb4-8842-12a172ca9af4-kube-api-access-5ll8n\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:12.036878 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:12.036848 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:12.037042 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:12.036984 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:12.037083 ip-10-0-143-10 
kubenswrapper[2570]: E0422 17:37:12.037057 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls podName:bce37b62-bce4-4fb4-8842-12a172ca9af4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:13.037041751 +0000 UTC m=+150.378040799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p4547" (UID: "bce37b62-bce4-4fb4-8842-12a172ca9af4") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:13.044217 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:13.044173 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:13.044615 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:13.044311 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:13.044615 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:13.044371 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls podName:bce37b62-bce4-4fb4-8842-12a172ca9af4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:15.044354191 +0000 UTC m=+152.385353240 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p4547" (UID: "bce37b62-bce4-4fb4-8842-12a172ca9af4") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:14.500108 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.500064 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6"] Apr 22 17:37:14.503101 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.503085 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" Apr 22 17:37:14.505432 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.505394 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4pdbm\"" Apr 22 17:37:14.512252 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.512222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6"] Apr 22 17:37:14.555473 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.555448 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6788t\" (UniqueName: \"kubernetes.io/projected/28652a65-52cf-44f8-933b-e65ad4e46b2c-kube-api-access-6788t\") pod \"network-check-source-8894fc9bd-t54f6\" (UID: \"28652a65-52cf-44f8-933b-e65ad4e46b2c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" Apr 22 17:37:14.656658 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.656633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6788t\" (UniqueName: \"kubernetes.io/projected/28652a65-52cf-44f8-933b-e65ad4e46b2c-kube-api-access-6788t\") pod 
\"network-check-source-8894fc9bd-t54f6\" (UID: \"28652a65-52cf-44f8-933b-e65ad4e46b2c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" Apr 22 17:37:14.665390 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.665361 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6788t\" (UniqueName: \"kubernetes.io/projected/28652a65-52cf-44f8-933b-e65ad4e46b2c-kube-api-access-6788t\") pod \"network-check-source-8894fc9bd-t54f6\" (UID: \"28652a65-52cf-44f8-933b-e65ad4e46b2c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" Apr 22 17:37:14.811994 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.811923 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" Apr 22 17:37:14.922479 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:14.922450 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6"] Apr 22 17:37:14.925047 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:14.925019 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28652a65_52cf_44f8_933b_e65ad4e46b2c.slice/crio-b77e67bfe72c04f574279c1acea3ce8ec1b66ae99b489260c85cfa6b1ef761f3 WatchSource:0}: Error finding container b77e67bfe72c04f574279c1acea3ce8ec1b66ae99b489260c85cfa6b1ef761f3: Status 404 returned error can't find the container with id b77e67bfe72c04f574279c1acea3ce8ec1b66ae99b489260c85cfa6b1ef761f3 Apr 22 17:37:15.059921 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:15.059884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: 
\"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" Apr 22 17:37:15.060049 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:15.060014 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:15.060131 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:15.060078 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls podName:bce37b62-bce4-4fb4-8842-12a172ca9af4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:19.060060108 +0000 UTC m=+156.401059172 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p4547" (UID: "bce37b62-bce4-4fb4-8842-12a172ca9af4") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:37:15.564844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:15.564812 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" event={"ID":"28652a65-52cf-44f8-933b-e65ad4e46b2c","Type":"ContainerStarted","Data":"87ac00bc19af5c1263e004f9ef49e7479a5cd4e87ef5d1158503bf994161af9e"} Apr 22 17:37:15.564844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:15.564848 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" event={"ID":"28652a65-52cf-44f8-933b-e65ad4e46b2c","Type":"ContainerStarted","Data":"b77e67bfe72c04f574279c1acea3ce8ec1b66ae99b489260c85cfa6b1ef761f3"} Apr 22 17:37:15.580795 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:15.580748 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t54f6" podStartSLOduration=1.580731924 podStartE2EDuration="1.580731924s" podCreationTimestamp="2026-04-22 17:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:37:15.579974763 +0000 UTC m=+152.920973837" watchObservedRunningTime="2026-04-22 17:37:15.580731924 +0000 UTC m=+152.921730996" Apr 22 17:37:17.128466 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.128439 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j"] Apr 22 17:37:17.131279 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.131264 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" Apr 22 17:37:17.133972 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.133951 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 17:37:17.134528 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.134505 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-dhdlr\"" Apr 22 17:37:17.135498 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.135480 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 17:37:17.140056 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.140031 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j"] Apr 22 17:37:17.176067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.176048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bc5sv\" (UniqueName: \"kubernetes.io/projected/444ac662-374e-4a35-971b-15f8e3f58a16-kube-api-access-bc5sv\") pod \"migrator-74bb7799d9-ts42j\" (UID: \"444ac662-374e-4a35-971b-15f8e3f58a16\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" Apr 22 17:37:17.276632 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.276595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5sv\" (UniqueName: \"kubernetes.io/projected/444ac662-374e-4a35-971b-15f8e3f58a16-kube-api-access-bc5sv\") pod \"migrator-74bb7799d9-ts42j\" (UID: \"444ac662-374e-4a35-971b-15f8e3f58a16\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" Apr 22 17:37:17.286585 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.286562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5sv\" (UniqueName: \"kubernetes.io/projected/444ac662-374e-4a35-971b-15f8e3f58a16-kube-api-access-bc5sv\") pod \"migrator-74bb7799d9-ts42j\" (UID: \"444ac662-374e-4a35-971b-15f8e3f58a16\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" Apr 22 17:37:17.440373 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.440313 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" Apr 22 17:37:17.549190 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.549153 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j"] Apr 22 17:37:17.552653 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:17.552628 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod444ac662_374e_4a35_971b_15f8e3f58a16.slice/crio-5907ef30c679d763f4cecb0636df0eacbb1ef50a7bf17fdb6ee2e09667cd5204 WatchSource:0}: Error finding container 5907ef30c679d763f4cecb0636df0eacbb1ef50a7bf17fdb6ee2e09667cd5204: Status 404 returned error can't find the container with id 5907ef30c679d763f4cecb0636df0eacbb1ef50a7bf17fdb6ee2e09667cd5204 Apr 22 17:37:17.569312 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:17.569290 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" event={"ID":"444ac662-374e-4a35-971b-15f8e3f58a16","Type":"ContainerStarted","Data":"5907ef30c679d763f4cecb0636df0eacbb1ef50a7bf17fdb6ee2e09667cd5204"} Apr 22 17:37:18.572806 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:18.572778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" event={"ID":"444ac662-374e-4a35-971b-15f8e3f58a16","Type":"ContainerStarted","Data":"bd388d029e14fd597444b4c8d6de2ee447029127ad790d4cce387887eb196c47"} Apr 22 17:37:18.572806 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:18.572813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" event={"ID":"444ac662-374e-4a35-971b-15f8e3f58a16","Type":"ContainerStarted","Data":"d5dc4f0d3eea3110b84c43453230a3913144fd350071e828f6e4589ef82c47dd"} Apr 22 17:37:18.588765 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:37:18.588721 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ts42j" podStartSLOduration=0.703363974 podStartE2EDuration="1.588705541s" podCreationTimestamp="2026-04-22 17:37:17 +0000 UTC" firstStartedPulling="2026-04-22 17:37:17.554485065 +0000 UTC m=+154.895484118" lastFinishedPulling="2026-04-22 17:37:18.439826636 +0000 UTC m=+155.780825685" observedRunningTime="2026-04-22 17:37:18.58748068 +0000 UTC m=+155.928479751" watchObservedRunningTime="2026-04-22 17:37:18.588705541 +0000 UTC m=+155.929704613"
Apr 22 17:37:18.627483 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:18.627465 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8fsg9_83519001-bdda-4c9d-ab90-db32b4638392/dns-node-resolver/0.log"
Apr 22 17:37:19.093014 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:19.092977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"
Apr 22 17:37:19.093183 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:19.093115 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:37:19.093183 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:19.093174 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls podName:bce37b62-bce4-4fb4-8842-12a172ca9af4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:27.093157567 +0000 UTC m=+164.434156618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p4547" (UID: "bce37b62-bce4-4fb4-8842-12a172ca9af4") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:37:19.825789 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:19.825754 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zff9d_a345adc2-0a7b-481f-ad9d-9acdfefd72d1/node-ca/0.log"
Apr 22 17:37:20.626062 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:20.626016 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" podUID="d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae"
Apr 22 17:37:20.650178 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:20.650151 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-85664766d-jq2bw" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5"
Apr 22 17:37:20.666258 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:20.666235 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hknm7" podUID="edd141af-4f14-4224-b057-0cd35252fcd8"
Apr 22 17:37:20.691652 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:20.691632 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dvrrw" podUID="d3a68516-ea37-46c1-bb27-cb34ede968ac"
Apr 22 17:37:21.580335 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:21.580304 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:37:21.580791 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:21.580304 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hknm7"
Apr 22 17:37:21.580791 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:21.580305 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:37:22.186482 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:22.186449 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-srjdz" podUID="0145db4f-d1c7-42f4-8607-b305371c3756"
Apr 22 17:37:25.539335 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.539299 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"
Apr 22 17:37:25.539815 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:25.539458 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:37:25.539815 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:25.539528 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert podName:d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae nodeName:}" failed. No retries permitted until 2026-04-22 17:39:27.539512273 +0000 UTC m=+284.880511323 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fjn2s" (UID: "d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae") : secret "networking-console-plugin-cert" not found
Apr 22 17:37:25.640251 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.640223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:37:25.640385 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.640266 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:37:25.640385 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.640287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:37:25.640385 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:25.640376 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:37:25.640523 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:25.640477 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert podName:d3a68516-ea37-46c1-bb27-cb34ede968ac nodeName:}" failed. No retries permitted until 2026-04-22 17:39:27.64045661 +0000 UTC m=+284.981455659 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert") pod "ingress-canary-dvrrw" (UID: "d3a68516-ea37-46c1-bb27-cb34ede968ac") : secret "canary-serving-cert" not found
Apr 22 17:37:25.642677 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.642653 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/edd141af-4f14-4224-b057-0cd35252fcd8-metrics-tls\") pod \"dns-default-hknm7\" (UID: \"edd141af-4f14-4224-b057-0cd35252fcd8\") " pod="openshift-dns/dns-default-hknm7"
Apr 22 17:37:25.642780 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.642723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"image-registry-85664766d-jq2bw\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:37:25.784417 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.784379 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzdz4\""
Apr 22 17:37:25.784569 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.784384 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m6rcd\""
Apr 22 17:37:25.791673 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.791616 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hknm7"
Apr 22 17:37:25.791673 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.791634 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:37:25.917987 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.917864 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hknm7"]
Apr 22 17:37:25.920439 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:25.920394 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd141af_4f14_4224_b057_0cd35252fcd8.slice/crio-52ea73e55a3705bb9822b2339f714d50210045b8965943266e27dc21e3d948a3 WatchSource:0}: Error finding container 52ea73e55a3705bb9822b2339f714d50210045b8965943266e27dc21e3d948a3: Status 404 returned error can't find the container with id 52ea73e55a3705bb9822b2339f714d50210045b8965943266e27dc21e3d948a3
Apr 22 17:37:25.936077 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:25.936056 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85664766d-jq2bw"]
Apr 22 17:37:25.938642 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:25.938619 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4789d962_90d0_4f73_b359_b7df6a792bd5.slice/crio-e8b0c9c05835644b4175e7ac68c7a391a9f4a9601053471809e82e9dbed50309 WatchSource:0}: Error finding container e8b0c9c05835644b4175e7ac68c7a391a9f4a9601053471809e82e9dbed50309: Status 404 returned error can't find the container with id e8b0c9c05835644b4175e7ac68c7a391a9f4a9601053471809e82e9dbed50309
Apr 22 17:37:26.594718 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:26.594679 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hknm7" event={"ID":"edd141af-4f14-4224-b057-0cd35252fcd8","Type":"ContainerStarted","Data":"52ea73e55a3705bb9822b2339f714d50210045b8965943266e27dc21e3d948a3"}
Apr 22 17:37:26.596159 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:26.596129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85664766d-jq2bw" event={"ID":"4789d962-90d0-4f73-b359-b7df6a792bd5","Type":"ContainerStarted","Data":"a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186"}
Apr 22 17:37:26.596284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:26.596166 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85664766d-jq2bw" event={"ID":"4789d962-90d0-4f73-b359-b7df6a792bd5","Type":"ContainerStarted","Data":"e8b0c9c05835644b4175e7ac68c7a391a9f4a9601053471809e82e9dbed50309"}
Apr 22 17:37:26.596352 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:26.596317 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-85664766d-jq2bw"
Apr 22 17:37:26.620378 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:26.620324 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-85664766d-jq2bw" podStartSLOduration=163.620312666 podStartE2EDuration="2m43.620312666s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:37:26.619393672 +0000 UTC m=+163.960392744" watchObservedRunningTime="2026-04-22 17:37:26.620312666 +0000 UTC m=+163.961311738"
Apr 22 17:37:27.153930 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:27.153893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"
Apr 22 17:37:27.154085 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:27.154035 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:37:27.154125 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:27.154112 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls podName:bce37b62-bce4-4fb4-8842-12a172ca9af4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:43.154096773 +0000 UTC m=+180.495095822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p4547" (UID: "bce37b62-bce4-4fb4-8842-12a172ca9af4") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:37:27.600568 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:27.600535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hknm7" event={"ID":"edd141af-4f14-4224-b057-0cd35252fcd8","Type":"ContainerStarted","Data":"38c38e8ba878e6a45943b4acf0a0bf2fdcd4cfaf49f7b32469ca88d525c8e9bb"}
Apr 22 17:37:27.600568 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:27.600573 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hknm7" event={"ID":"edd141af-4f14-4224-b057-0cd35252fcd8","Type":"ContainerStarted","Data":"1e9902bd8666753006b957ac2437e0a676acf27a029608c6ef708c2a54ba1d30"}
Apr 22 17:37:27.618240 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:27.618197 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hknm7" podStartSLOduration=129.32933853 podStartE2EDuration="2m10.618184973s" podCreationTimestamp="2026-04-22 17:35:17 +0000 UTC" firstStartedPulling="2026-04-22 17:37:25.92227345 +0000 UTC m=+163.263272499" lastFinishedPulling="2026-04-22 17:37:27.211119887 +0000 UTC m=+164.552118942" observedRunningTime="2026-04-22 17:37:27.617554006 +0000 UTC m=+164.958553079" watchObservedRunningTime="2026-04-22 17:37:27.618184973 +0000 UTC m=+164.959184043"
Apr 22 17:37:28.603557 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:28.603528 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hknm7"
Apr 22 17:37:35.163060 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:35.162976 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dvrrw"
Apr 22 17:37:36.162932 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:36.162897 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz"
Apr 22 17:37:38.609529 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:38.609500 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hknm7"
Apr 22 17:37:42.039939 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.039909 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rr4td"]
Apr 22 17:37:42.046039 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.046022 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.055512 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.055492 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 17:37:42.056342 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.056327 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 17:37:42.056641 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.056626 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 17:37:42.056741 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.056700 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tt9tl\""
Apr 22 17:37:42.057115 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.057099 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 17:37:42.063149 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.063130 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rr4td"]
Apr 22 17:37:42.161496 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.161468 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/20f22498-ba1e-4fad-8fc7-110f430def54-data-volume\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.161622 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.161519 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/20f22498-ba1e-4fad-8fc7-110f430def54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.161622 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.161544 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794z7\" (UniqueName: \"kubernetes.io/projected/20f22498-ba1e-4fad-8fc7-110f430def54-kube-api-access-794z7\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.161622 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.161590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/20f22498-ba1e-4fad-8fc7-110f430def54-crio-socket\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.161752 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.161681 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/20f22498-ba1e-4fad-8fc7-110f430def54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.262840 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.262810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-794z7\" (UniqueName: \"kubernetes.io/projected/20f22498-ba1e-4fad-8fc7-110f430def54-kube-api-access-794z7\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.262966 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.262847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/20f22498-ba1e-4fad-8fc7-110f430def54-crio-socket\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.262966 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.262895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/20f22498-ba1e-4fad-8fc7-110f430def54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.263081 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.262981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/20f22498-ba1e-4fad-8fc7-110f430def54-data-volume\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.263081 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.262998 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/20f22498-ba1e-4fad-8fc7-110f430def54-crio-socket\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.263081 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.263056 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/20f22498-ba1e-4fad-8fc7-110f430def54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.263346 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.263327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/20f22498-ba1e-4fad-8fc7-110f430def54-data-volume\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.263581 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.263560 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/20f22498-ba1e-4fad-8fc7-110f430def54-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.265515 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.265496 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/20f22498-ba1e-4fad-8fc7-110f430def54-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.271525 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.271507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-794z7\" (UniqueName: \"kubernetes.io/projected/20f22498-ba1e-4fad-8fc7-110f430def54-kube-api-access-794z7\") pod \"insights-runtime-extractor-rr4td\" (UID: \"20f22498-ba1e-4fad-8fc7-110f430def54\") " pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.354994 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.354936 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rr4td"
Apr 22 17:37:42.474876 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.474840 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rr4td"]
Apr 22 17:37:42.478934 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:42.478905 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f22498_ba1e_4fad_8fc7_110f430def54.slice/crio-06d6ed2cfead1cb74afaba364f1470a91205303e93df52bfd962162b479bf1fb WatchSource:0}: Error finding container 06d6ed2cfead1cb74afaba364f1470a91205303e93df52bfd962162b479bf1fb: Status 404 returned error can't find the container with id 06d6ed2cfead1cb74afaba364f1470a91205303e93df52bfd962162b479bf1fb
Apr 22 17:37:42.638861 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.638780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rr4td" event={"ID":"20f22498-ba1e-4fad-8fc7-110f430def54","Type":"ContainerStarted","Data":"261e82f91b92839d36ce0795f3225ab07c77d04eb4ee42f1ef65809fe8f12217"}
Apr 22 17:37:42.638861 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:42.638817 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rr4td" event={"ID":"20f22498-ba1e-4fad-8fc7-110f430def54","Type":"ContainerStarted","Data":"06d6ed2cfead1cb74afaba364f1470a91205303e93df52bfd962162b479bf1fb"}
Apr 22 17:37:43.170282 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.170255 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"
Apr 22 17:37:43.172377 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.172353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bce37b62-bce4-4fb4-8842-12a172ca9af4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p4547\" (UID: \"bce37b62-bce4-4fb4-8842-12a172ca9af4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"
Apr 22 17:37:43.196380 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.196363 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8pzjm\""
Apr 22 17:37:43.204317 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.204298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"
Apr 22 17:37:43.314069 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.314035 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547"]
Apr 22 17:37:43.316630 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:43.316572 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbce37b62_bce4_4fb4_8842_12a172ca9af4.slice/crio-58c2d96d5b32d542a8b9d78eaab833f22d36d5a7b8acab7b9dd4944cd97fbd33 WatchSource:0}: Error finding container 58c2d96d5b32d542a8b9d78eaab833f22d36d5a7b8acab7b9dd4944cd97fbd33: Status 404 returned error can't find the container with id 58c2d96d5b32d542a8b9d78eaab833f22d36d5a7b8acab7b9dd4944cd97fbd33
Apr 22 17:37:43.643273 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.643188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rr4td" event={"ID":"20f22498-ba1e-4fad-8fc7-110f430def54","Type":"ContainerStarted","Data":"98a56616040318c9dcbf406ef805677ff75d8c0c31105b3b151cab718895cf7e"}
Apr 22 17:37:43.644332 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:43.644298 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" event={"ID":"bce37b62-bce4-4fb4-8842-12a172ca9af4","Type":"ContainerStarted","Data":"58c2d96d5b32d542a8b9d78eaab833f22d36d5a7b8acab7b9dd4944cd97fbd33"}
Apr 22 17:37:45.497523 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.497489 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"]
Apr 22 17:37:45.500716 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.500693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"
Apr 22 17:37:45.503076 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.503054 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 17:37:45.503183 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.503081 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ph6pj\""
Apr 22 17:37:45.511931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.511912 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"]
Apr 22 17:37:45.589532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.589504 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b8b995d5-3b09-42a1-976e-755867230655-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f756w\" (UID: \"b8b995d5-3b09-42a1-976e-755867230655\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"
Apr 22 17:37:45.651014 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.650989 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rr4td" event={"ID":"20f22498-ba1e-4fad-8fc7-110f430def54","Type":"ContainerStarted","Data":"a6740cab8817834611c76e957d9c4c74dd9bda350c694b6ac55f11a1540feb60"}
Apr 22 17:37:45.656334 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.656308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" event={"ID":"bce37b62-bce4-4fb4-8842-12a172ca9af4","Type":"ContainerStarted","Data":"e2e06b9f50c0089b836bcded3e0630f17eacd7766c1062f9930d7deb8c6849c8"}
Apr 22 17:37:45.670849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.670803 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rr4td" podStartSLOduration=1.22544582 podStartE2EDuration="3.670792367s" podCreationTimestamp="2026-04-22 17:37:42 +0000 UTC" firstStartedPulling="2026-04-22 17:37:42.539847802 +0000 UTC m=+179.880846857" lastFinishedPulling="2026-04-22 17:37:44.985194351 +0000 UTC m=+182.326193404" observedRunningTime="2026-04-22 17:37:45.670436919 +0000 UTC m=+183.011435987" watchObservedRunningTime="2026-04-22 17:37:45.670792367 +0000 UTC m=+183.011791439"
Apr 22 17:37:45.684445 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.684387 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p4547" podStartSLOduration=33.014607333 podStartE2EDuration="34.684376102s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="2026-04-22 17:37:43.318456808 +0000 UTC m=+180.659455858" lastFinishedPulling="2026-04-22 17:37:44.988225574 +0000 UTC m=+182.329224627" observedRunningTime="2026-04-22 17:37:45.683968128 +0000 UTC m=+183.024967198" watchObservedRunningTime="2026-04-22 17:37:45.684376102 +0000 UTC m=+183.025375173"
Apr 22 17:37:45.690276 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.690248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b8b995d5-3b09-42a1-976e-755867230655-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f756w\" (UID: \"b8b995d5-3b09-42a1-976e-755867230655\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"
Apr 22 17:37:45.692805 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.692789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b8b995d5-3b09-42a1-976e-755867230655-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f756w\" (UID: \"b8b995d5-3b09-42a1-976e-755867230655\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"
Apr 22 17:37:45.796216 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.796141 2570 patch_prober.go:28] interesting pod/image-registry-85664766d-jq2bw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 17:37:45.796320 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.796190 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-85664766d-jq2bw" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:37:45.810438 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.810417 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"
Apr 22 17:37:45.923694 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:45.923666 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w"]
Apr 22 17:37:45.926993 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:45.926964 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b995d5_3b09_42a1_976e_755867230655.slice/crio-1e997465ee562e4b09c5504694c7b5b740183585f544b703131a0d5d598730b9 WatchSource:0}: Error finding container 1e997465ee562e4b09c5504694c7b5b740183585f544b703131a0d5d598730b9: Status 404 returned error can't find the container with id 1e997465ee562e4b09c5504694c7b5b740183585f544b703131a0d5d598730b9
Apr 22 17:37:46.660493 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:46.660452 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w" event={"ID":"b8b995d5-3b09-42a1-976e-755867230655","Type":"ContainerStarted","Data":"1e997465ee562e4b09c5504694c7b5b740183585f544b703131a0d5d598730b9"}
Apr 22 17:37:47.110947 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.110917 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-758cdfbcf7-mklvx"]
Apr 22 17:37:47.113900 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.113884 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:47.116499 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.116481 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 17:37:47.116794 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.116777 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 17:37:47.117704 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.117683 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m79jb\""
Apr 22 17:37:47.117836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.117683 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 17:37:47.117836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.117688 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 17:37:47.117836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.117787 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 17:37:47.117836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.117789 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 17:37:47.120314 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.120298 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 17:37:47.126517 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.126496 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758cdfbcf7-mklvx"]
Apr 22 17:37:47.200930 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.200868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-serving-cert\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:47.200930 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.200901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-oauth-serving-cert\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:47.200930 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.200922 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-oauth-config\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:47.201099 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.200952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-service-ca\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:47.201099 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.201038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-config\") pod \"console-758cdfbcf7-mklvx\" (UID:
\"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.201099 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.201073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjl2\" (UniqueName: \"kubernetes.io/projected/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-kube-api-access-dhjl2\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.302042 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.302014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhjl2\" (UniqueName: \"kubernetes.io/projected/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-kube-api-access-dhjl2\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.302099 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.302055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-serving-cert\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.302155 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.302096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-oauth-serving-cert\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.302155 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.302124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-oauth-config\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.302155 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.302147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-service-ca\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.302287 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.302226 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-config\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.303457 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.303433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-service-ca\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.303581 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.303480 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-oauth-serving-cert\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.303644 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.303594 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-config\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.304617 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.304597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-serving-cert\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.304705 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.304644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-oauth-config\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.310460 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.310436 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjl2\" (UniqueName: \"kubernetes.io/projected/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-kube-api-access-dhjl2\") pod \"console-758cdfbcf7-mklvx\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.402107 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.402076 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" podUID="89c77394-bfa7-431a-bff0-f69f1cf7c185" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 22 17:37:47.423186 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.423164 2570 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:47.538189 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.538137 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758cdfbcf7-mklvx"] Apr 22 17:37:47.540616 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:47.540590 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9438de3a_0410_4e89_bafc_f5ed9bcb2c3f.slice/crio-8f843fd5618a593fbdc752d42445b03e1219bba8ad812ed56ce7144a752113c4 WatchSource:0}: Error finding container 8f843fd5618a593fbdc752d42445b03e1219bba8ad812ed56ce7144a752113c4: Status 404 returned error can't find the container with id 8f843fd5618a593fbdc752d42445b03e1219bba8ad812ed56ce7144a752113c4 Apr 22 17:37:47.604967 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.604947 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:37:47.664592 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.664561 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w" event={"ID":"b8b995d5-3b09-42a1-976e-755867230655","Type":"ContainerStarted","Data":"49e5daab91475ba00ce833fea3c052f1f6048f183f9330ba5d377bb23144b9ef"} Apr 22 17:37:47.664990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.664717 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w" Apr 22 17:37:47.665824 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.665791 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758cdfbcf7-mklvx" event={"ID":"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f","Type":"ContainerStarted","Data":"8f843fd5618a593fbdc752d42445b03e1219bba8ad812ed56ce7144a752113c4"} Apr 22 
17:37:47.667176 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.667152 2570 generic.go:358] "Generic (PLEG): container finished" podID="89c77394-bfa7-431a-bff0-f69f1cf7c185" containerID="d63a2892229f338b1767b8e7743f352eae81dafcfb7ac1e845b35215824ed18f" exitCode=1 Apr 22 17:37:47.667288 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.667200 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" event={"ID":"89c77394-bfa7-431a-bff0-f69f1cf7c185","Type":"ContainerDied","Data":"d63a2892229f338b1767b8e7743f352eae81dafcfb7ac1e845b35215824ed18f"} Apr 22 17:37:47.667552 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.667536 2570 scope.go:117] "RemoveContainer" containerID="d63a2892229f338b1767b8e7743f352eae81dafcfb7ac1e845b35215824ed18f" Apr 22 17:37:47.669747 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.669728 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w" Apr 22 17:37:47.701948 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:47.701902 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f756w" podStartSLOduration=1.741939833 podStartE2EDuration="2.701887035s" podCreationTimestamp="2026-04-22 17:37:45 +0000 UTC" firstStartedPulling="2026-04-22 17:37:45.928785179 +0000 UTC m=+183.269784231" lastFinishedPulling="2026-04-22 17:37:46.888732385 +0000 UTC m=+184.229731433" observedRunningTime="2026-04-22 17:37:47.683607052 +0000 UTC m=+185.024606123" watchObservedRunningTime="2026-04-22 17:37:47.701887035 +0000 UTC m=+185.042886107" Apr 22 17:37:48.672291 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:48.672258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" 
event={"ID":"89c77394-bfa7-431a-bff0-f69f1cf7c185","Type":"ContainerStarted","Data":"2fde4af9bad28ce3a5095f32b26951084da0c1b7fce90f97f64d5a1efc031b95"} Apr 22 17:37:48.672852 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:48.672830 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:37:48.673449 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:48.673427 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74686c766b-8x8c2" Apr 22 17:37:50.678903 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:50.678861 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758cdfbcf7-mklvx" event={"ID":"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f","Type":"ContainerStarted","Data":"6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8"} Apr 22 17:37:50.697940 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:50.697889 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-758cdfbcf7-mklvx" podStartSLOduration=1.093624228 podStartE2EDuration="3.697874247s" podCreationTimestamp="2026-04-22 17:37:47 +0000 UTC" firstStartedPulling="2026-04-22 17:37:47.542283077 +0000 UTC m=+184.883282128" lastFinishedPulling="2026-04-22 17:37:50.146533096 +0000 UTC m=+187.487532147" observedRunningTime="2026-04-22 17:37:50.696810499 +0000 UTC m=+188.037809569" watchObservedRunningTime="2026-04-22 17:37:50.697874247 +0000 UTC m=+188.038873321" Apr 22 17:37:52.888563 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.888530 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"] Apr 22 17:37:52.891681 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.891663 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:52.895063 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.895040 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 17:37:52.895175 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.895081 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-57c7p\"" Apr 22 17:37:52.895175 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.895116 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:37:52.895175 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.895081 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 17:37:52.904743 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.904726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"] Apr 22 17:37:52.918882 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.918863 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kdb9m"] Apr 22 17:37:52.921876 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.921862 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:52.924127 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.924109 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:37:52.924365 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.924351 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8rhq9\"" Apr 22 17:37:52.924958 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.924944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:37:52.926714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:52.926691 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:37:53.042731 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042704 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-wtmp\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042731 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6825a724-cd96-48d9-9d88-2e764cd2c29b-metrics-client-ca\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042751 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-root\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-textfile\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-sys\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042852 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7c7\" (UniqueName: \"kubernetes.io/projected/6825a724-cd96-48d9-9d88-2e764cd2c29b-kube-api-access-zf7c7\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.042931 
ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-accelerators-collector-config\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.042931 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042921 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-tls\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.043193 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9w6t\" (UniqueName: \"kubernetes.io/projected/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-kube-api-access-v9w6t\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.043193 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042958 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.043193 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.042972 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.043193 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.043014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144007 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.143931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-accelerators-collector-config\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144007 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.143962 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-tls\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144007 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.143985 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9w6t\" (UniqueName: \"kubernetes.io/projected/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-kube-api-access-v9w6t\") pod 
\"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.144007 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.144291 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" Apr 22 17:37:53.144291 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144291 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-wtmp\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144291 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:37:53.144219 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-wtmp\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144515 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144358 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6825a724-cd96-48d9-9d88-2e764cd2c29b-metrics-client-ca\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144515 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-root\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144515 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144477 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-textfile\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144674 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144527 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-root\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m" Apr 22 17:37:53.144674 ip-10-0-143-10 kubenswrapper[2570]: 
I0422 17:37:53.144667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-sys\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.144768 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144704 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7c7\" (UniqueName: \"kubernetes.io/projected/6825a724-cd96-48d9-9d88-2e764cd2c29b-kube-api-access-zf7c7\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.144768 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.144738 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.145052 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:53.144863 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 22 17:37:53.145052 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:37:53.144924 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-tls podName:9aea887a-f1ce-4e5a-acf6-9d80afe812bb nodeName:}" failed. No retries permitted until 2026-04-22 17:37:53.644904404 +0000 UTC m=+190.985903470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-8tbx4" (UID: "9aea887a-f1ce-4e5a-acf6-9d80afe812bb") : secret "openshift-state-metrics-tls" not found
Apr 22 17:37:53.145052 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.145000 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6825a724-cd96-48d9-9d88-2e764cd2c29b-metrics-client-ca\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.145218 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.145106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-accelerators-collector-config\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.145218 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.145151 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.145218 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.145163 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6825a724-cd96-48d9-9d88-2e764cd2c29b-sys\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.145462 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.145439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-textfile\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.146741 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.146707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.146867 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.146845 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6825a724-cd96-48d9-9d88-2e764cd2c29b-node-exporter-tls\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.147172 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.147136 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.151581 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.151557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9w6t\" (UniqueName: \"kubernetes.io/projected/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-kube-api-access-v9w6t\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.153330 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.153307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7c7\" (UniqueName: \"kubernetes.io/projected/6825a724-cd96-48d9-9d88-2e764cd2c29b-kube-api-access-zf7c7\") pod \"node-exporter-kdb9m\" (UID: \"6825a724-cd96-48d9-9d88-2e764cd2c29b\") " pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.233796 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.233765 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kdb9m"
Apr 22 17:37:53.241526 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:53.241502 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6825a724_cd96_48d9_9d88_2e764cd2c29b.slice/crio-15dcbd53f89d050a1937a27a5a47f811929b57560dd844701d812843abbbc40d WatchSource:0}: Error finding container 15dcbd53f89d050a1937a27a5a47f811929b57560dd844701d812843abbbc40d: Status 404 returned error can't find the container with id 15dcbd53f89d050a1937a27a5a47f811929b57560dd844701d812843abbbc40d
Apr 22 17:37:53.648843 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.648807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.651373 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.651342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aea887a-f1ce-4e5a-acf6-9d80afe812bb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8tbx4\" (UID: \"9aea887a-f1ce-4e5a-acf6-9d80afe812bb\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.687942 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.687912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdb9m" event={"ID":"6825a724-cd96-48d9-9d88-2e764cd2c29b","Type":"ContainerStarted","Data":"15dcbd53f89d050a1937a27a5a47f811929b57560dd844701d812843abbbc40d"}
Apr 22 17:37:53.800242 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.800193 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"
Apr 22 17:37:53.936021 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:53.935948 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4"]
Apr 22 17:37:53.939385 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:53.939345 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aea887a_f1ce_4e5a_acf6_9d80afe812bb.slice/crio-7987f82d027e3404a7ada83952c73e9db731716218bbc71f0e6c1d2a3211b682 WatchSource:0}: Error finding container 7987f82d027e3404a7ada83952c73e9db731716218bbc71f0e6c1d2a3211b682: Status 404 returned error can't find the container with id 7987f82d027e3404a7ada83952c73e9db731716218bbc71f0e6c1d2a3211b682
Apr 22 17:37:54.693011 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:54.692976 2570 generic.go:358] "Generic (PLEG): container finished" podID="6825a724-cd96-48d9-9d88-2e764cd2c29b" containerID="9a9f0076657f99607e3b9e00d22da8070e4020bf2385c4cc8ba064577bee8293" exitCode=0
Apr 22 17:37:54.693176 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:54.693055 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdb9m" event={"ID":"6825a724-cd96-48d9-9d88-2e764cd2c29b","Type":"ContainerDied","Data":"9a9f0076657f99607e3b9e00d22da8070e4020bf2385c4cc8ba064577bee8293"}
Apr 22 17:37:54.695262 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:54.695235 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" event={"ID":"9aea887a-f1ce-4e5a-acf6-9d80afe812bb","Type":"ContainerStarted","Data":"dedddbb678c6159d0db0c0c7e4c8102b24d8bec6f8aff0739b54b7eaeb85378b"}
Apr 22 17:37:54.695371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:54.695265 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" event={"ID":"9aea887a-f1ce-4e5a-acf6-9d80afe812bb","Type":"ContainerStarted","Data":"7274c10a470c7be2b1b462091e05f083fd7d52788c9d44cdf83d672d17546acb"}
Apr 22 17:37:54.695371 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:54.695279 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" event={"ID":"9aea887a-f1ce-4e5a-acf6-9d80afe812bb","Type":"ContainerStarted","Data":"7987f82d027e3404a7ada83952c73e9db731716218bbc71f0e6c1d2a3211b682"}
Apr 22 17:37:55.700270 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.700213 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdb9m" event={"ID":"6825a724-cd96-48d9-9d88-2e764cd2c29b","Type":"ContainerStarted","Data":"a0a6edba2fa3194cbdf0ee9f04efd62942498d9d17f625706cac11c7d6da6437"}
Apr 22 17:37:55.700270 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.700257 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdb9m" event={"ID":"6825a724-cd96-48d9-9d88-2e764cd2c29b","Type":"ContainerStarted","Data":"300f52668e8caa3d57158e98141ae7fe8d8b87c9b166404d3b3ed359d7a68ef0"}
Apr 22 17:37:55.701973 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.701947 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" event={"ID":"9aea887a-f1ce-4e5a-acf6-9d80afe812bb","Type":"ContainerStarted","Data":"1b31813547e8c3a252b2cf712dc00bf694ea4f4d720ba81bd3011e83e2fd61b0"}
Apr 22 17:37:55.721675 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.721628 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kdb9m" podStartSLOduration=2.755553024 podStartE2EDuration="3.721615015s" podCreationTimestamp="2026-04-22 17:37:52 +0000 UTC" firstStartedPulling="2026-04-22 17:37:53.243159236 +0000 UTC m=+190.584158288" lastFinishedPulling="2026-04-22 17:37:54.209221231 +0000 UTC m=+191.550220279" observedRunningTime="2026-04-22 17:37:55.72012134 +0000 UTC m=+193.061120412" watchObservedRunningTime="2026-04-22 17:37:55.721615015 +0000 UTC m=+193.062614080"
Apr 22 17:37:55.737864 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.737821 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8tbx4" podStartSLOduration=2.877232646 podStartE2EDuration="3.737809314s" podCreationTimestamp="2026-04-22 17:37:52 +0000 UTC" firstStartedPulling="2026-04-22 17:37:54.222954434 +0000 UTC m=+191.563953483" lastFinishedPulling="2026-04-22 17:37:55.083531097 +0000 UTC m=+192.424530151" observedRunningTime="2026-04-22 17:37:55.737066269 +0000 UTC m=+193.078065362" watchObservedRunningTime="2026-04-22 17:37:55.737809314 +0000 UTC m=+193.078808385"
Apr 22 17:37:55.940322 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.940292 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-84777b654d-wqzfk"]
Apr 22 17:37:55.944101 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.944081 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:55.947744 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.947724 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 17:37:55.947890 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.947733 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 17:37:55.947992 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.947975 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 17:37:55.948312 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.948290 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 17:37:55.948432 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.948415 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ffh9tg6n82kkf\""
Apr 22 17:37:55.948490 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.948441 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 17:37:55.948746 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.948725 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-lcsm8\""
Apr 22 17:37:55.957787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:55.957762 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84777b654d-wqzfk"]
Apr 22 17:37:56.067643 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.067643 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067653 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-tls\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.067836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-grpc-tls\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.067836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.067836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e8ff30-4f30-4a45-8ffc-69de18231068-metrics-client-ca\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.067836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067808 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.067836 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067829 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4kq4\" (UniqueName: \"kubernetes.io/projected/e3e8ff30-4f30-4a45-8ffc-69de18231068-kube-api-access-g4kq4\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.068044 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.067852 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169215 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e8ff30-4f30-4a45-8ffc-69de18231068-metrics-client-ca\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169215 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169435 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4kq4\" (UniqueName: \"kubernetes.io/projected/e3e8ff30-4f30-4a45-8ffc-69de18231068-kube-api-access-g4kq4\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169435 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169435 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169580 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-tls\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169580 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-grpc-tls\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.169580 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.169523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.170108 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.170085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e8ff30-4f30-4a45-8ffc-69de18231068-metrics-client-ca\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.172074 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.172008 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.172188 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.172130 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.172505 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.172486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.172583 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.172531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.172689 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.172669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-thanos-querier-tls\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.172734 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.172721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e3e8ff30-4f30-4a45-8ffc-69de18231068-secret-grpc-tls\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.180946 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.180928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4kq4\" (UniqueName: \"kubernetes.io/projected/e3e8ff30-4f30-4a45-8ffc-69de18231068-kube-api-access-g4kq4\") pod \"thanos-querier-84777b654d-wqzfk\" (UID: \"e3e8ff30-4f30-4a45-8ffc-69de18231068\") " pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.253760 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.253731 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk"
Apr 22 17:37:56.375119 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.375045 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-84777b654d-wqzfk"]
Apr 22 17:37:56.377167 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:56.377140 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e8ff30_4f30_4a45_8ffc_69de18231068.slice/crio-bdec642c3824577207023add830d0686a720b549df0fe0ecf731195839cd2ba4 WatchSource:0}: Error finding container bdec642c3824577207023add830d0686a720b549df0fe0ecf731195839cd2ba4: Status 404 returned error can't find the container with id bdec642c3824577207023add830d0686a720b549df0fe0ecf731195839cd2ba4
Apr 22 17:37:56.706267 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:56.706186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"bdec642c3824577207023add830d0686a720b549df0fe0ecf731195839cd2ba4"}
Apr 22 17:37:57.229513 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.229481 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56745797b9-6n5gh"]
Apr 22 17:37:57.232879 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.232858 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.235456 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.235414 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 17:37:57.235565 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.235420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5hrbw\""
Apr 22 17:37:57.235565 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.235500 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 17:37:57.235701 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.235418 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 17:37:57.236480 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.236463 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 17:37:57.236607 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.236582 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-8t3g45kjnh6iq\""
Apr 22 17:37:57.241895 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.241859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56745797b9-6n5gh"]
Apr 22 17:37:57.379160 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379129 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-metrics-server-audit-profiles\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.379160 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379160 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-secret-metrics-server-client-certs\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.379341 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-audit-log\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.379341 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bws5f\" (UniqueName: \"kubernetes.io/projected/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-kube-api-access-bws5f\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.379341 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379306 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-secret-metrics-server-tls\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.379341 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.379486 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.379366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-client-ca-bundle\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.424243 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.424214 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:57.424243 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.424246 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:57.429103 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.429084 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-758cdfbcf7-mklvx"
Apr 22 17:37:57.479900 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.479831 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-metrics-server-audit-profiles\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.479900 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.479874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-secret-metrics-server-client-certs\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480064 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-audit-log\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480115 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bws5f\" (UniqueName: \"kubernetes.io/projected/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-kube-api-access-bws5f\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480115 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-secret-metrics-server-tls\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480196 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480196 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-client-ca-bundle\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480681 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-audit-log\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.480917 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.481024 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.480940 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-metrics-server-audit-profiles\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.483311 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.483155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-client-ca-bundle\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.483311 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.483263 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-secret-metrics-server-tls\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.483378 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.483349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-secret-metrics-server-client-certs\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.488713 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.488689 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bws5f\" (UniqueName: \"kubernetes.io/projected/215aa1a6-12c0-4aca-a53d-4ef29b1d5c40-kube-api-access-bws5f\") pod \"metrics-server-56745797b9-6n5gh\" (UID: \"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40\") " pod="openshift-monitoring/metrics-server-56745797b9-6n5gh"
Apr 22 17:37:57.544769 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.544737 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" Apr 22 17:37:57.685115 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.685064 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56745797b9-6n5gh"] Apr 22 17:37:57.688310 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:57.688282 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215aa1a6_12c0_4aca_a53d_4ef29b1d5c40.slice/crio-cd28d8376bb1cbf9bfaef9969feabf0ac1b82ace46415f510e76f7bd27decbf8 WatchSource:0}: Error finding container cd28d8376bb1cbf9bfaef9969feabf0ac1b82ace46415f510e76f7bd27decbf8: Status 404 returned error can't find the container with id cd28d8376bb1cbf9bfaef9969feabf0ac1b82ace46415f510e76f7bd27decbf8 Apr 22 17:37:57.705787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.705763 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758cdfbcf7-mklvx"] Apr 22 17:37:57.713237 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.712459 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" event={"ID":"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40","Type":"ContainerStarted","Data":"cd28d8376bb1cbf9bfaef9969feabf0ac1b82ace46415f510e76f7bd27decbf8"} Apr 22 17:37:57.719212 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:57.719192 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:37:58.126671 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.126637 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-57755bc657-fxlbv"] Apr 22 17:37:58.131154 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.131139 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.133751 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.133730 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 17:37:58.134003 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.133976 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 17:37:58.134003 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.133989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 17:37:58.134233 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.134202 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jksvf\"" Apr 22 17:37:58.134305 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.134283 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 17:37:58.134364 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.134301 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 17:37:58.140334 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.140305 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 17:37:58.148115 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.148093 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-57755bc657-fxlbv"] Apr 22 17:37:58.288458 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-secret-telemeter-client\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288631 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288511 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447rv\" (UniqueName: \"kubernetes.io/projected/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-kube-api-access-447rv\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288631 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288544 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-serving-certs-ca-bundle\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288631 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288635 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-federate-client-tls\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288700 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-telemeter-client-tls\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288720 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-metrics-client-ca\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.288787 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.288767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.389841 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.389749 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-447rv\" (UniqueName: \"kubernetes.io/projected/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-kube-api-access-447rv\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " 
pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.389841 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.389800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-serving-certs-ca-bundle\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.389841 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.389838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.390119 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.389871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-federate-client-tls\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.390119 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.389935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-telemeter-client-tls\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.390119 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.389958 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-metrics-client-ca\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.390119 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.390010 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.390119 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.390068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-secret-telemeter-client\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.391259 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.390938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-serving-certs-ca-bundle\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.391259 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.390938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-metrics-client-ca\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: 
\"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.391749 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.391700 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.393327 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.393277 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-telemeter-client-tls\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.393469 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.393330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-federate-client-tls\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.394380 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.394354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-secret-telemeter-client\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.394492 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.394425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.399707 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.399683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-447rv\" (UniqueName: \"kubernetes.io/projected/6e24c657-c6c6-4873-bdbb-160d9dba6dc3-kube-api-access-447rv\") pod \"telemeter-client-57755bc657-fxlbv\" (UID: \"6e24c657-c6c6-4873-bdbb-160d9dba6dc3\") " pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:58.441518 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:58.441492 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" Apr 22 17:37:59.143350 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.143309 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:37:59.147656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.147630 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.150233 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.150208 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 17:37:59.150358 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.150252 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 17:37:59.150358 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.150269 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fhzzx\"" Apr 22 17:37:59.150358 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.150325 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 17:37:59.151190 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.151169 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 17:37:59.151312 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.151230 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 17:37:59.151373 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.151329 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 17:37:59.151692 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.151632 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 17:37:59.151825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.151759 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 17:37:59.151825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.151777 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-hcu7uc3sc6h4\"" Apr 22 17:37:59.152028 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.152012 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 17:37:59.152455 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.152421 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 17:37:59.154858 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.154833 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 17:37:59.156637 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.156615 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 17:37:59.159661 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.159640 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:37:59.299601 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m57tk\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-kube-api-access-m57tk\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299752 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299752 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299752 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299809 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
17:37:59.299865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299827 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299861 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299979 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299894 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299979 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299979 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.299979 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.300129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.299994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.300129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.300029 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.300129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.300054 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.300129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.300086 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.300299 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.300152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.300299 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.300191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401475 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401578 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401578 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401539 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401707 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401730 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.401990 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.401969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.402313 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.402021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.402313 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.402062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.402313 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.402102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m57tk\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-kube-api-access-m57tk\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.402313 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.402285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.403072 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.402941 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.405498 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.405234 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.407892 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.407587 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-web-config\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.410485 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.408105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.410485 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.409282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.410485 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.409950 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.411929 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.410939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.411929 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.411561 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.411929 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.411836 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.412644 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.412476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.412644 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.412608 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.413832 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.413793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.414387 
ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.414345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.416087 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.415921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m57tk\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-kube-api-access-m57tk\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.416419 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.416365 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.419055 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.418884 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config-out\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.420938 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.420915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config\") pod \"prometheus-k8s-0\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.461622 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:37:59.461589 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:59.608593 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.608483 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:37:59.612090 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:59.612055 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa65efb9_f709_4357_bfdd_bb82f1b9652e.slice/crio-f8d3c8806c750c68b1207c6c3e282b7a78d62ced0bd40edc7ca310ae11780aee WatchSource:0}: Error finding container f8d3c8806c750c68b1207c6c3e282b7a78d62ced0bd40edc7ca310ae11780aee: Status 404 returned error can't find the container with id f8d3c8806c750c68b1207c6c3e282b7a78d62ced0bd40edc7ca310ae11780aee Apr 22 17:37:59.683779 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.683719 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-57755bc657-fxlbv"] Apr 22 17:37:59.686766 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:37:59.686737 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e24c657_c6c6_4873_bdbb_160d9dba6dc3.slice/crio-4e0c07c891736d290309304146b37f786e9181310a5bfa0838d80101796fb64e WatchSource:0}: Error finding container 4e0c07c891736d290309304146b37f786e9181310a5bfa0838d80101796fb64e: Status 404 returned error can't find the container with id 4e0c07c891736d290309304146b37f786e9181310a5bfa0838d80101796fb64e Apr 22 17:37:59.719063 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.719031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" 
event={"ID":"215aa1a6-12c0-4aca-a53d-4ef29b1d5c40","Type":"ContainerStarted","Data":"b54140393bb54954f4b65c872949f9d45aac097f7af32c2bc3c42a8ce96f98f6"} Apr 22 17:37:59.721013 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.720985 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"e64cae18fc37bd08777a6fff02686cf6401ce943aa29e9cd9915dca6d8190d03"} Apr 22 17:37:59.721142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.721021 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"997e1b2b097e103a9cd23c19db5d22c633ef3adc5002dcebf046f80daebfcd07"} Apr 22 17:37:59.721142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.721037 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"9290f347a03730b9d583a9bb9588ca65436ad2d060b207ae57e4c5494fca8fbe"} Apr 22 17:37:59.722081 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.722055 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"f8d3c8806c750c68b1207c6c3e282b7a78d62ced0bd40edc7ca310ae11780aee"} Apr 22 17:37:59.723043 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.723025 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" event={"ID":"6e24c657-c6c6-4873-bdbb-160d9dba6dc3","Type":"ContainerStarted","Data":"4e0c07c891736d290309304146b37f786e9181310a5bfa0838d80101796fb64e"} Apr 22 17:37:59.737332 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:37:59.737297 2570 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" podStartSLOduration=1.091262652 podStartE2EDuration="2.737286112s" podCreationTimestamp="2026-04-22 17:37:57 +0000 UTC" firstStartedPulling="2026-04-22 17:37:57.691150544 +0000 UTC m=+195.032149604" lastFinishedPulling="2026-04-22 17:37:59.337173994 +0000 UTC m=+196.678173064" observedRunningTime="2026-04-22 17:37:59.736703206 +0000 UTC m=+197.077702277" watchObservedRunningTime="2026-04-22 17:37:59.737286112 +0000 UTC m=+197.078285183" Apr 22 17:38:00.728053 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:00.727913 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} Apr 22 17:38:01.731656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.731622 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e" exitCode=0 Apr 22 17:38:01.732077 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.731685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} Apr 22 17:38:01.733582 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.733558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" event={"ID":"6e24c657-c6c6-4873-bdbb-160d9dba6dc3","Type":"ContainerStarted","Data":"eb2da2769acc92c4aef630fbeab33b7d79337f43f6a7ba9580677cfa69c1215d"} Apr 22 17:38:01.733672 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.733594 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" event={"ID":"6e24c657-c6c6-4873-bdbb-160d9dba6dc3","Type":"ContainerStarted","Data":"b78b70fa4df6d91425bd3a379f5a663da412c72dd803ca2be29665c0ffd7961c"} Apr 22 17:38:01.733672 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.733608 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" event={"ID":"6e24c657-c6c6-4873-bdbb-160d9dba6dc3","Type":"ContainerStarted","Data":"8318e06bab31ef754dc7c0b1e2afc1ccf8933955da107a114ddc40372b5b11bb"} Apr 22 17:38:01.735963 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.735938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"20e25620e4c32dbc0c03c7b065ef7e06ce38365ff04cb6479922212d5ff52c0f"} Apr 22 17:38:01.736070 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.735970 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"f182742a3fb3cf20b6ca6ee31bdd5be82e6f0adfb1cfcc246c0c5a92f05aa4e9"} Apr 22 17:38:01.736070 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.735984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" event={"ID":"e3e8ff30-4f30-4a45-8ffc-69de18231068","Type":"ContainerStarted","Data":"4d049083eb9e4d167ba1d3a86d1610b649b0072a9da09dac9439bfaab6aac5b1"} Apr 22 17:38:01.736156 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.736144 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" Apr 22 17:38:01.785042 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.785003 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-57755bc657-fxlbv" podStartSLOduration=2.057135552 podStartE2EDuration="3.784992568s" podCreationTimestamp="2026-04-22 17:37:58 +0000 UTC" firstStartedPulling="2026-04-22 17:37:59.688665301 +0000 UTC m=+197.029664351" lastFinishedPulling="2026-04-22 17:38:01.416522308 +0000 UTC m=+198.757521367" observedRunningTime="2026-04-22 17:38:01.784077857 +0000 UTC m=+199.125076928" watchObservedRunningTime="2026-04-22 17:38:01.784992568 +0000 UTC m=+199.125991639" Apr 22 17:38:01.813613 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:01.813576 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" podStartSLOduration=2.366449763 podStartE2EDuration="6.813566101s" podCreationTimestamp="2026-04-22 17:37:55 +0000 UTC" firstStartedPulling="2026-04-22 17:37:56.378840534 +0000 UTC m=+193.719839586" lastFinishedPulling="2026-04-22 17:38:00.825956869 +0000 UTC m=+198.166955924" observedRunningTime="2026-04-22 17:38:01.812594078 +0000 UTC m=+199.153593148" watchObservedRunningTime="2026-04-22 17:38:01.813566101 +0000 UTC m=+199.154565211" Apr 22 17:38:04.190064 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:04.190030 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85664766d-jq2bw"] Apr 22 17:38:05.750606 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:05.750583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} Apr 22 17:38:05.750880 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:05.750614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} Apr 22 17:38:05.750880 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:05.750624 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} Apr 22 17:38:05.750880 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:05.750632 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} Apr 22 17:38:05.750880 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:05.750642 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} Apr 22 17:38:06.756862 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:06.756828 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerStarted","Data":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} Apr 22 17:38:06.788834 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:06.788785 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.009142602 podStartE2EDuration="7.788772528s" podCreationTimestamp="2026-04-22 17:37:59 +0000 UTC" firstStartedPulling="2026-04-22 17:37:59.614658484 +0000 UTC m=+196.955657535" lastFinishedPulling="2026-04-22 17:38:05.394288404 +0000 UTC m=+202.735287461" observedRunningTime="2026-04-22 
17:38:06.786229919 +0000 UTC m=+204.127229027" watchObservedRunningTime="2026-04-22 17:38:06.788772528 +0000 UTC m=+204.129771598" Apr 22 17:38:07.745915 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:07.745887 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-84777b654d-wqzfk" Apr 22 17:38:09.461847 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:09.461801 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:17.545071 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:17.545029 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" Apr 22 17:38:17.545480 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:17.545097 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" Apr 22 17:38:23.738725 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:23.738664 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-758cdfbcf7-mklvx" podUID="9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" containerName="console" containerID="cri-o://6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8" gracePeriod=15 Apr 22 17:38:23.977758 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:23.977738 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758cdfbcf7-mklvx_9438de3a-0410-4e89-bafc-f5ed9bcb2c3f/console/0.log" Apr 22 17:38:23.977859 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:23.977807 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:38:24.013885 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.013829 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-serving-cert\") pod \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " Apr 22 17:38:24.013885 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.013859 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-config\") pod \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " Apr 22 17:38:24.014067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.013900 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-oauth-config\") pod \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " Apr 22 17:38:24.014067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.013931 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-oauth-serving-cert\") pod \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " Apr 22 17:38:24.014067 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.013958 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-service-ca\") pod \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " Apr 22 17:38:24.014067 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:38:24.014002 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjl2\" (UniqueName: \"kubernetes.io/projected/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-kube-api-access-dhjl2\") pod \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\" (UID: \"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f\") " Apr 22 17:38:24.014385 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.014354 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-config" (OuterVolumeSpecName: "console-config") pod "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" (UID: "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:24.014535 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.014464 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" (UID: "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:24.014610 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.014551 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-service-ca" (OuterVolumeSpecName: "service-ca") pod "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" (UID: "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:24.016301 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.016278 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" (UID: "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:24.016416 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.016309 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" (UID: "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:24.016416 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.016322 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-kube-api-access-dhjl2" (OuterVolumeSpecName: "kube-api-access-dhjl2") pod "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" (UID: "9438de3a-0410-4e89-bafc-f5ed9bcb2c3f"). InnerVolumeSpecName "kube-api-access-dhjl2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:24.114978 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.114942 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-serving-cert\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:24.114978 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.114975 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:24.114978 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.114985 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-console-oauth-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:24.115141 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.114994 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-oauth-serving-cert\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:24.115141 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.115003 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-service-ca\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:24.115141 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.115012 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhjl2\" (UniqueName: \"kubernetes.io/projected/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f-kube-api-access-dhjl2\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:24.809206 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:38:24.809180 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758cdfbcf7-mklvx_9438de3a-0410-4e89-bafc-f5ed9bcb2c3f/console/0.log" Apr 22 17:38:24.809642 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.809218 2570 generic.go:358] "Generic (PLEG): container finished" podID="9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" containerID="6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8" exitCode=2 Apr 22 17:38:24.809642 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.809246 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758cdfbcf7-mklvx" event={"ID":"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f","Type":"ContainerDied","Data":"6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8"} Apr 22 17:38:24.809642 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.809279 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758cdfbcf7-mklvx" Apr 22 17:38:24.809642 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.809300 2570 scope.go:117] "RemoveContainer" containerID="6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8" Apr 22 17:38:24.809642 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.809281 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758cdfbcf7-mklvx" event={"ID":"9438de3a-0410-4e89-bafc-f5ed9bcb2c3f","Type":"ContainerDied","Data":"8f843fd5618a593fbdc752d42445b03e1219bba8ad812ed56ce7144a752113c4"} Apr 22 17:38:24.818434 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.818415 2570 scope.go:117] "RemoveContainer" containerID="6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8" Apr 22 17:38:24.818704 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:38:24.818684 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8\": container with ID starting with 6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8 not found: ID does not exist" containerID="6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8" Apr 22 17:38:24.818768 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.818710 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8"} err="failed to get container status \"6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8\": rpc error: code = NotFound desc = could not find container \"6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8\": container with ID starting with 6b0fa05fa0111d4d2212d4c0c465102faf218b8bd061a54d34a79f4a03032ce8 not found: ID does not exist" Apr 22 17:38:24.829796 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.829777 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758cdfbcf7-mklvx"] Apr 22 17:38:24.835567 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:24.835548 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-758cdfbcf7-mklvx"] Apr 22 17:38:25.167010 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:25.166922 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" path="/var/lib/kubelet/pods/9438de3a-0410-4e89-bafc-f5ed9bcb2c3f/volumes" Apr 22 17:38:29.209055 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.209001 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-85664766d-jq2bw" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5" containerName="registry" containerID="cri-o://a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186" gracePeriod=30 Apr 22 17:38:29.450475 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.450451 
2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:38:29.558505 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558428 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-image-registry-private-configuration\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558505 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558469 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfjj4\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-kube-api-access-jfjj4\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558505 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558494 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-trusted-ca\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558770 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558520 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-bound-sa-token\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558770 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558537 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" 
(UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558770 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558559 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-certificates\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558770 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558578 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-installation-pull-secrets\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.558770 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558613 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4789d962-90d0-4f73-b359-b7df6a792bd5-ca-trust-extracted\") pod \"4789d962-90d0-4f73-b359-b7df6a792bd5\" (UID: \"4789d962-90d0-4f73-b359-b7df6a792bd5\") " Apr 22 17:38:29.559013 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.558965 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:29.559110 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.559091 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-trusted-ca\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.559517 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.559490 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:29.561238 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.561198 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:29.561337 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.561269 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:29.561337 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.561273 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-kube-api-access-jfjj4" (OuterVolumeSpecName: "kube-api-access-jfjj4") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "kube-api-access-jfjj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:29.561457 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.561431 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:29.561538 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.561515 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:29.567117 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.567090 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4789d962-90d0-4f73-b359-b7df6a792bd5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4789d962-90d0-4f73-b359-b7df6a792bd5" (UID: "4789d962-90d0-4f73-b359-b7df6a792bd5"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:29.659809 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659781 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jfjj4\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-kube-api-access-jfjj4\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.659809 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659807 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-bound-sa-token\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.659976 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659819 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.659976 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659827 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4789d962-90d0-4f73-b359-b7df6a792bd5-registry-certificates\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.659976 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659836 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-installation-pull-secrets\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.659976 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659845 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4789d962-90d0-4f73-b359-b7df6a792bd5-ca-trust-extracted\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.659976 
ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.659855 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4789d962-90d0-4f73-b359-b7df6a792bd5-image-registry-private-configuration\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:38:29.825283 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.825191 2570 generic.go:358] "Generic (PLEG): container finished" podID="4789d962-90d0-4f73-b359-b7df6a792bd5" containerID="a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186" exitCode=0 Apr 22 17:38:29.825283 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.825261 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85664766d-jq2bw" Apr 22 17:38:29.825526 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.825284 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85664766d-jq2bw" event={"ID":"4789d962-90d0-4f73-b359-b7df6a792bd5","Type":"ContainerDied","Data":"a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186"} Apr 22 17:38:29.825526 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.825329 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85664766d-jq2bw" event={"ID":"4789d962-90d0-4f73-b359-b7df6a792bd5","Type":"ContainerDied","Data":"e8b0c9c05835644b4175e7ac68c7a391a9f4a9601053471809e82e9dbed50309"} Apr 22 17:38:29.825526 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.825346 2570 scope.go:117] "RemoveContainer" containerID="a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186" Apr 22 17:38:29.833308 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.833294 2570 scope.go:117] "RemoveContainer" containerID="a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186" Apr 22 17:38:29.833537 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:38:29.833520 
2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186\": container with ID starting with a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186 not found: ID does not exist" containerID="a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186" Apr 22 17:38:29.833595 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.833545 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186"} err="failed to get container status \"a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186\": rpc error: code = NotFound desc = could not find container \"a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186\": container with ID starting with a4ac77d5783afa0d87c978265f7f018e7086aba9e1270c858ebbf2a0d5c02186 not found: ID does not exist" Apr 22 17:38:29.845281 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.845258 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85664766d-jq2bw"] Apr 22 17:38:29.849128 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:29.849109 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85664766d-jq2bw"] Apr 22 17:38:31.167331 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:31.167297 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5" path="/var/lib/kubelet/pods/4789d962-90d0-4f73-b359-b7df6a792bd5/volumes" Apr 22 17:38:37.550991 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:37.550962 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" Apr 22 17:38:37.554809 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:37.554788 2570 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56745797b9-6n5gh" Apr 22 17:38:55.068454 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:55.068382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:38:55.070816 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:55.070793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0145db4f-d1c7-42f4-8607-b305371c3756-metrics-certs\") pod \"network-metrics-daemon-srjdz\" (UID: \"0145db4f-d1c7-42f4-8607-b305371c3756\") " pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:38:55.366145 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:55.366065 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qs9jr\"" Apr 22 17:38:55.374271 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:55.374250 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srjdz" Apr 22 17:38:55.507752 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:55.507729 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-srjdz"] Apr 22 17:38:55.510046 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:38:55.510019 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0145db4f_d1c7_42f4_8607_b305371c3756.slice/crio-4b14d7a767860ba22639188e8bed549c5cc8bffcb1af4523c0be4b5448176a73 WatchSource:0}: Error finding container 4b14d7a767860ba22639188e8bed549c5cc8bffcb1af4523c0be4b5448176a73: Status 404 returned error can't find the container with id 4b14d7a767860ba22639188e8bed549c5cc8bffcb1af4523c0be4b5448176a73 Apr 22 17:38:55.905444 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:55.905390 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srjdz" event={"ID":"0145db4f-d1c7-42f4-8607-b305371c3756","Type":"ContainerStarted","Data":"4b14d7a767860ba22639188e8bed549c5cc8bffcb1af4523c0be4b5448176a73"} Apr 22 17:38:56.910289 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:56.910201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srjdz" event={"ID":"0145db4f-d1c7-42f4-8607-b305371c3756","Type":"ContainerStarted","Data":"6b7646d44d156784927c9189979a1731c270a57f42dab6a195ef88341d05dd92"} Apr 22 17:38:56.910289 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:56.910236 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srjdz" event={"ID":"0145db4f-d1c7-42f4-8607-b305371c3756","Type":"ContainerStarted","Data":"cf1f1fd4515d863c1834110e7fc4e0d9c47e33ef000e472f1af020edb156acc6"} Apr 22 17:38:56.929648 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:56.929600 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-srjdz" podStartSLOduration=252.956612837 podStartE2EDuration="4m13.929587169s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:38:55.511897533 +0000 UTC m=+252.852896588" lastFinishedPulling="2026-04-22 17:38:56.484871861 +0000 UTC m=+253.825870920" observedRunningTime="2026-04-22 17:38:56.926980157 +0000 UTC m=+254.267979227" watchObservedRunningTime="2026-04-22 17:38:56.929587169 +0000 UTC m=+254.270586239" Apr 22 17:38:59.461974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:59.461937 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:59.481239 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:59.481217 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:59.933846 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:38:59.933777 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:17.417830 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.417744 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:39:17.418392 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.418341 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="prometheus" containerID="cri-o://9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239" gracePeriod=600 Apr 22 17:39:17.418512 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.418395 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-thanos" 
containerID="cri-o://62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15" gracePeriod=600 Apr 22 17:39:17.418512 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.418453 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy" containerID="cri-o://51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9" gracePeriod=600 Apr 22 17:39:17.418512 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.418472 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-web" containerID="cri-o://434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4" gracePeriod=600 Apr 22 17:39:17.418512 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.418462 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="thanos-sidecar" containerID="cri-o://acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5" gracePeriod=600 Apr 22 17:39:17.418673 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.418492 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="config-reloader" containerID="cri-o://3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc" gracePeriod=600 Apr 22 17:39:17.658474 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.658453 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:17.745703 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745677 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-web-config\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.745865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745721 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-serving-certs-ca-bundle\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.745865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745750 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-kube-rbac-proxy\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.745865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745784 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.745865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745813 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-db\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: 
\"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.745865 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745839 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745871 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-thanos-prometheus-http-client-file\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745919 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-kubelet-serving-ca-bundle\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745952 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-tls\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.745979 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-metrics-client-certs\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: 
\"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746013 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-metrics-client-ca\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746052 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m57tk\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-kube-api-access-m57tk\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746077 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746106 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-grpc-tls\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746584 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746135 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-trusted-ca-bundle\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: 
\"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746584 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746167 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config-out\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746584 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746182 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:17.746584 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746207 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-rulefiles-0\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746584 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746234 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-tls-assets\") pod \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\" (UID: \"aa65efb9-f709-4357-bfdd-bb82f1b9652e\") " Apr 22 17:39:17.746810 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.746742 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-serving-certs-ca-bundle\") on node 
\"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.747295 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.747216 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:17.748239 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.747566 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:17.748239 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.747647 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:39:17.748239 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.748002 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:17.749351 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.749322 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:17.749518 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.749369 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.749656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.749614 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.750131 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.750103 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:39:17.750672 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.750650 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.750801 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.750772 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.751329 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.751299 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.751448 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.751375 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config-out" (OuterVolumeSpecName: "config-out") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:39:17.751522 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.751475 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.751560 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.751517 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config" (OuterVolumeSpecName: "config") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.751764 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.751732 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.751992 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.751975 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-kube-api-access-m57tk" (OuterVolumeSpecName: "kube-api-access-m57tk") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "kube-api-access-m57tk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:39:17.759992 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.759973 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-web-config" (OuterVolumeSpecName: "web-config") pod "aa65efb9-f709-4357-bfdd-bb82f1b9652e" (UID: "aa65efb9-f709-4357-bfdd-bb82f1b9652e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:17.847680 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847646 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-kube-rbac-proxy\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847680 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847679 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847680 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847689 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-db\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847698 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847706 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847717 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847726 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847736 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-metrics-client-certs\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847744 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-configmap-metrics-client-ca\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847753 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m57tk\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-kube-api-access-m57tk\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847762 2570 reconciler_common.go:299] "Volume detached for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847770 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-secret-grpc-tls\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847781 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847789 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa65efb9-f709-4357-bfdd-bb82f1b9652e-config-out\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847797 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aa65efb9-f709-4357-bfdd-bb82f1b9652e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847805 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa65efb9-f709-4357-bfdd-bb82f1b9652e-tls-assets\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.847844 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.847813 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/aa65efb9-f709-4357-bfdd-bb82f1b9652e-web-config\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:39:17.973129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973099 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15" exitCode=0 Apr 22 17:39:17.973129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973121 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9" exitCode=0 Apr 22 17:39:17.973129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973127 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4" exitCode=0 Apr 22 17:39:17.973129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973133 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5" exitCode=0 Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973137 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc" exitCode=0 Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973142 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239" exitCode=0 Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973162 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973191 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973198 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973212 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15" Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973203 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} Apr 22 17:39:17.973353 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973341 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} Apr 22 17:39:17.973653 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973363 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} Apr 22 17:39:17.973653 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973379 2570 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} Apr 22 17:39:17.973653 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.973394 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aa65efb9-f709-4357-bfdd-bb82f1b9652e","Type":"ContainerDied","Data":"f8d3c8806c750c68b1207c6c3e282b7a78d62ced0bd40edc7ca310ae11780aee"} Apr 22 17:39:17.983558 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.983538 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9" Apr 22 17:39:17.990166 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.990150 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4" Apr 22 17:39:17.996634 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.996612 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5" Apr 22 17:39:17.997927 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:17.997910 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:39:18.003166 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.003144 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc" Apr 22 17:39:18.004043 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.003989 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:39:18.009321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.009306 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239" Apr 22 17:39:18.016061 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.016048 2570 scope.go:117] "RemoveContainer" 
containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e" Apr 22 17:39:18.022118 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022102 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15" Apr 22 17:39:18.022394 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.022366 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15" Apr 22 17:39:18.022472 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022436 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} err="failed to get container status \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist" Apr 22 17:39:18.022472 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022462 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9" Apr 22 17:39:18.022672 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.022652 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist" 
containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9" Apr 22 17:39:18.022757 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022676 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} err="failed to get container status \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist" Apr 22 17:39:18.022757 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022692 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4" Apr 22 17:39:18.022898 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.022880 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4" Apr 22 17:39:18.022945 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022903 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} err="failed to get container status \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist" Apr 22 
17:39:18.022945 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.022916 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5" Apr 22 17:39:18.023132 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.023116 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5" Apr 22 17:39:18.023183 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023134 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} err="failed to get container status \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist" Apr 22 17:39:18.023183 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023147 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc" Apr 22 17:39:18.023357 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.023344 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc" Apr 22 17:39:18.023422 
ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023360 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} err="failed to get container status \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist" Apr 22 17:39:18.023422 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023378 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239" Apr 22 17:39:18.023601 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.023585 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239" Apr 22 17:39:18.023643 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023607 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} err="failed to get container status \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist" Apr 22 17:39:18.023643 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023622 2570 scope.go:117] "RemoveContainer" 
containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e" Apr 22 17:39:18.023814 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:18.023797 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e" Apr 22 17:39:18.023852 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023818 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} err="failed to get container status \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist" Apr 22 17:39:18.023852 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023828 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15" Apr 22 17:39:18.024017 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.023999 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} err="failed to get container status \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist" Apr 22 
17:39:18.024078 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024017 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"
Apr 22 17:39:18.024230 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024208 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} err="failed to get container status \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist"
Apr 22 17:39:18.024271 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024232 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"
Apr 22 17:39:18.024450 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024431 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} err="failed to get container status \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist"
Apr 22 17:39:18.024513 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024451 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"
Apr 22 17:39:18.024667 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024650 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} err="failed to get container status \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist"
Apr 22 17:39:18.024717 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024666 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"
Apr 22 17:39:18.024866 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024850 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} err="failed to get container status \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist"
Apr 22 17:39:18.024911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.024867 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"
Apr 22 17:39:18.025055 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025038 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} err="failed to get container status \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist"
Apr 22 17:39:18.025100 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025055 2570 scope.go:117] "RemoveContainer" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"
Apr 22 17:39:18.025265 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025243 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} err="failed to get container status \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist"
Apr 22 17:39:18.025338 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025267 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"
Apr 22 17:39:18.025455 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025437 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} err="failed to get container status \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist"
Apr 22 17:39:18.025504 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025455 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"
Apr 22 17:39:18.025659 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025643 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} err="failed to get container status \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist"
Apr 22 17:39:18.025714 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025659 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"
Apr 22 17:39:18.025849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025833 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} err="failed to get container status \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist"
Apr 22 17:39:18.025900 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.025849 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"
Apr 22 17:39:18.026074 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026057 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} err="failed to get container status \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist"
Apr 22 17:39:18.026128 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026075 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"
Apr 22 17:39:18.026284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026267 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} err="failed to get container status \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist"
Apr 22 17:39:18.026326 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026285 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"
Apr 22 17:39:18.026511 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026494 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} err="failed to get container status \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist"
Apr 22 17:39:18.026566 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026511 2570 scope.go:117] "RemoveContainer" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"
Apr 22 17:39:18.026703 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026689 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} err="failed to get container status \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist"
Apr 22 17:39:18.026703 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026703 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"
Apr 22 17:39:18.026867 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026853 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} err="failed to get container status \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist"
Apr 22 17:39:18.026867 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.026866 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"
Apr 22 17:39:18.027036 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.027022 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} err="failed to get container status \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist"
Apr 22 17:39:18.031591 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.027036 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"
Apr 22 17:39:18.032750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.031804 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} err="failed to get container status \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist"
Apr 22 17:39:18.032750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.031835 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"
Apr 22 17:39:18.032911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.032772 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} err="failed to get container status \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist"
Apr 22 17:39:18.032911 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.032796 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"
Apr 22 17:39:18.033126 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.033092 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} err="failed to get container status \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist"
Apr 22 17:39:18.033126 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.033116 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"
Apr 22 17:39:18.033872 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.033850 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} err="failed to get container status \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist"
Apr 22 17:39:18.033872 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.033872 2570 scope.go:117] "RemoveContainer" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"
Apr 22 17:39:18.034100 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034082 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} err="failed to get container status \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist"
Apr 22 17:39:18.034154 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034101 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"
Apr 22 17:39:18.034301 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034285 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} err="failed to get container status \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist"
Apr 22 17:39:18.034364 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034301 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"
Apr 22 17:39:18.034548 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034527 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} err="failed to get container status \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist"
Apr 22 17:39:18.034627 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034549 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"
Apr 22 17:39:18.034790 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034767 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} err="failed to get container status \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist"
Apr 22 17:39:18.034849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034793 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"
Apr 22 17:39:18.034958 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.034940 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:39:18.035022 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035000 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} err="failed to get container status \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist"
Apr 22 17:39:18.035022 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035019 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"
Apr 22 17:39:18.035252 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035233 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} err="failed to get container status \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035254 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035259 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-thanos"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035271 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-thanos"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035282 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="config-reloader"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035288 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="config-reloader"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035297 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-web"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035302 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-web"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035308 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035312 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035317 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" containerName="console"
Apr 22 17:39:18.035321 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035323 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" containerName="console"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035334 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="prometheus"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035341 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="prometheus"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035351 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="thanos-sidecar"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035357 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="thanos-sidecar"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035367 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5" containerName="registry"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035375 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5" containerName="registry"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035386 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="init-config-reloader"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035394 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="init-config-reloader"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035481 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9438de3a-0410-4e89-bafc-f5ed9bcb2c3f" containerName="console"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035475 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} err="failed to get container status \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035507 2570 scope.go:117] "RemoveContainer" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035489 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-web"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035566 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="thanos-sidecar"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035578 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4789d962-90d0-4f73-b359-b7df6a792bd5" containerName="registry"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035590 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035604 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="prometheus"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035614 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="kube-rbac-proxy-thanos"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035625 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" containerName="config-reloader"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035727 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} err="failed to get container status \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist"
Apr 22 17:39:18.035825 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035752 2570 scope.go:117] "RemoveContainer" containerID="62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035936 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15"} err="failed to get container status \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": rpc error: code = NotFound desc = could not find container \"62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15\": container with ID starting with 62d5e9ca6a95806f9e5076efdecb2a8184a00edcf3ba9099cb6cf4429175db15 not found: ID does not exist"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.035950 2570 scope.go:117] "RemoveContainer" containerID="51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036109 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9"} err="failed to get container status \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": rpc error: code = NotFound desc = could not find container \"51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9\": container with ID starting with 51002f5290cf70dd00a3a9f73b06da1c720e8f9301ec045e71d0dab8a9152cd9 not found: ID does not exist"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036127 2570 scope.go:117] "RemoveContainer" containerID="434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036270 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4"} err="failed to get container status \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": rpc error: code = NotFound desc = could not find container \"434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4\": container with ID starting with 434930987a5208e93f4cee5ff88572934bfb114d12614166a8a089bf24a28bc4 not found: ID does not exist"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036289 2570 scope.go:117] "RemoveContainer" containerID="acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036523 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5"} err="failed to get container status \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": rpc error: code = NotFound desc = could not find container \"acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5\": container with ID starting with acdf879774517b917cb958f9528179a5ebeac5eb52cc9584d6b75ec085229ea5 not found: ID does not exist"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036537 2570 scope.go:117] "RemoveContainer" containerID="3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036709 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc"} err="failed to get container status \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": rpc error: code = NotFound desc = could not find container \"3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc\": container with ID starting with 3728ac2f8068af3376f9c65d9d176b799b4a791cbc23d6b6ce247aa7f33949fc not found: ID does not exist"
Apr 22 17:39:18.036754 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036725 2570 scope.go:117] "RemoveContainer" containerID="9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"
Apr 22 17:39:18.037220 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036931 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239"} err="failed to get container status \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": rpc error: code = NotFound desc = could not find container \"9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239\": container with ID starting with 9dd838119fd2afaf07e52cf73c6979517a6295a567a18dce1c38887db5fcc239 not found: ID does not exist"
Apr 22 17:39:18.037220 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.036946 2570 scope.go:117] "RemoveContainer" containerID="248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"
Apr 22 17:39:18.037220 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.037162 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e"} err="failed to get container status \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": rpc error: code = NotFound desc = could not find container \"248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e\": container with ID starting with 248eb34bb68750621c1b7d6e4dacb51af3f990dfdb9b03c673c03e5136a7919e not found: ID does not exist"
Apr 22 17:39:18.041458 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.041441 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.044047 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044030 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 17:39:18.044148 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044030 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 17:39:18.044273 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044256 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 17:39:18.044365 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044351 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 17:39:18.044560 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-hcu7uc3sc6h4\""
Apr 22 17:39:18.044632 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fhzzx\""
Apr 22 17:39:18.044792 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044760 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 17:39:18.044886 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.044774 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 17:39:18.045627 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.045493 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 17:39:18.046793 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.046470 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 17:39:18.046793 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.046697 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 17:39:18.047762 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.047615 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 17:39:18.048480 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.047944 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:39:18.048609 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.048592 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 17:39:18.048879 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.048859 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-web-config\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.048945 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.048907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049001 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.048949 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049001 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.048972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/17064d57-7459-4283-858a-2b62af750ac3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049001 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.048991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049071 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049142 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049118 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:39:18.049361 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049147 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22
17:39:18.049361 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049168 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/17064d57-7459-4283-858a-2b62af750ac3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.049820 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049797 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2n5t\" (UniqueName: \"kubernetes.io/projected/17064d57-7459-4283-858a-2b62af750ac3-kube-api-access-r2n5t\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.049899 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-config\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.049899 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.049878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.050026 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.050008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17064d57-7459-4283-858a-2b62af750ac3-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.050129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.050032 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.050129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.050061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.050776 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.050755 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 17:39:18.150639 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17064d57-7459-4283-858a-2b62af750ac3-config-out\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.150750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.150750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.150750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-web-config\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.150750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.150750 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150725 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151031 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/17064d57-7459-4283-858a-2b62af750ac3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151031 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151031 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.150951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151031 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151031 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151059 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151109 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/17064d57-7459-4283-858a-2b62af750ac3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2n5t\" (UniqueName: \"kubernetes.io/projected/17064d57-7459-4283-858a-2b62af750ac3-kube-api-access-r2n5t\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:39:18.151214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-config\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151285 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151482 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.151656 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.152153 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.151728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/17064d57-7459-4283-858a-2b62af750ac3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.153121 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.152839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.154367 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.154325 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17064d57-7459-4283-858a-2b62af750ac3-config-out\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.155318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.155532 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.155548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/17064d57-7459-4283-858a-2b62af750ac3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.155735 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.156154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.156244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156284 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.156259 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-web-config\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156627 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.156376 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156627 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.156422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/17064d57-7459-4283-858a-2b62af750ac3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.156824 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.156807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.157305 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.157279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/17064d57-7459-4283-858a-2b62af750ac3-config\") pod \"prometheus-k8s-0\" (UID: \"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.161532 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.161514 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2n5t\" (UniqueName: \"kubernetes.io/projected/17064d57-7459-4283-858a-2b62af750ac3-kube-api-access-r2n5t\") pod \"prometheus-k8s-0\" (UID: 
\"17064d57-7459-4283-858a-2b62af750ac3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.353151 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.353077 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:18.478941 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.478910 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:39:18.480229 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:39:18.480201 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17064d57_7459_4283_858a_2b62af750ac3.slice/crio-edc7892539cfa59215c9e3d1e577ed961a7c943159c00319292d2e44f4026e4a WatchSource:0}: Error finding container edc7892539cfa59215c9e3d1e577ed961a7c943159c00319292d2e44f4026e4a: Status 404 returned error can't find the container with id edc7892539cfa59215c9e3d1e577ed961a7c943159c00319292d2e44f4026e4a Apr 22 17:39:18.978200 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.978169 2570 generic.go:358] "Generic (PLEG): container finished" podID="17064d57-7459-4283-858a-2b62af750ac3" containerID="28cc7d03f18de5a975fea1aefa8c0e650c7dafa587fada1bfe586d7c413983bb" exitCode=0 Apr 22 17:39:18.978362 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.978222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerDied","Data":"28cc7d03f18de5a975fea1aefa8c0e650c7dafa587fada1bfe586d7c413983bb"} Apr 22 17:39:18.978362 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:18.978242 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"edc7892539cfa59215c9e3d1e577ed961a7c943159c00319292d2e44f4026e4a"} Apr 22 17:39:19.174108 ip-10-0-143-10 
kubenswrapper[2570]: I0422 17:39:19.169378 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa65efb9-f709-4357-bfdd-bb82f1b9652e" path="/var/lib/kubelet/pods/aa65efb9-f709-4357-bfdd-bb82f1b9652e/volumes" Apr 22 17:39:19.983474 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:19.983441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"6ccf228de850211cd98f3ed28ef31203034e93a2a2375ea3b4ee01908baae8f6"} Apr 22 17:39:19.983474 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:19.983474 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"912f71b7b3fd912929cdc25422011d859eb182aeee026d951814573296c7b1a9"} Apr 22 17:39:19.983474 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:19.983483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"9b8780abe6af6ab822435a1f58ae1a5adfada14b5bd87c417364b0e2f4250fb1"} Apr 22 17:39:19.983887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:19.983491 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"352f59c11f52a9d6075d5baaa920145044392c6ec0df8ae4318ebec1c0f86633"} Apr 22 17:39:19.983887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:19.983500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"d32f2193737244380dc423246c11f3bf54701cf93bdd41c4343f24f511de0367"} Apr 22 17:39:19.983887 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:19.983507 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"17064d57-7459-4283-858a-2b62af750ac3","Type":"ContainerStarted","Data":"4d9fdd476ee51ec9d21b4625fd45293489858c2c62c7ef6372e678f196b639f6"} Apr 22 17:39:20.013984 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:20.013924 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.013904644 podStartE2EDuration="2.013904644s" podCreationTimestamp="2026-04-22 17:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:39:20.010381587 +0000 UTC m=+277.351380668" watchObservedRunningTime="2026-04-22 17:39:20.013904644 +0000 UTC m=+277.354903719" Apr 22 17:39:23.354107 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:23.354076 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:24.581797 ip-10-0-143-10 kubenswrapper[2570]: E0422 17:39:24.581752 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" podUID="d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae" Apr 22 17:39:24.998849 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:24.998824 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:39:27.630365 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.630332 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:39:27.632925 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.632901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fjn2s\" (UID: \"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:39:27.701892 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.701869 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-4bjzc\"" Apr 22 17:39:27.709974 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.709957 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" Apr 22 17:39:27.730969 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.730943 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:39:27.733335 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.733317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a68516-ea37-46c1-bb27-cb34ede968ac-cert\") pod \"ingress-canary-dvrrw\" (UID: \"d3a68516-ea37-46c1-bb27-cb34ede968ac\") " pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:39:27.830767 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.830745 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s"] Apr 22 17:39:27.833233 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:39:27.833207 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd20cafc2_7331_4bf9_ae5c_8d94d62ff5ae.slice/crio-743639e30cbdc9547bf0d616e0f9be9d3e67a94ad4d99dab02f3f39aa4e1ab50 WatchSource:0}: Error finding container 743639e30cbdc9547bf0d616e0f9be9d3e67a94ad4d99dab02f3f39aa4e1ab50: Status 404 returned error can't find the container with id 743639e30cbdc9547bf0d616e0f9be9d3e67a94ad4d99dab02f3f39aa4e1ab50 Apr 22 17:39:27.965989 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.965967 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dt854\"" Apr 22 17:39:27.973890 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:27.973874 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dvrrw" Apr 22 17:39:28.008202 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:28.008144 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" event={"ID":"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae","Type":"ContainerStarted","Data":"743639e30cbdc9547bf0d616e0f9be9d3e67a94ad4d99dab02f3f39aa4e1ab50"} Apr 22 17:39:28.090222 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:28.090193 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dvrrw"] Apr 22 17:39:28.093255 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:39:28.093228 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3a68516_ea37_46c1_bb27_cb34ede968ac.slice/crio-fab9f2daa618ec2e6c77540e2e8a27661d0bb27b3ab74898265e400eb9a698d5 WatchSource:0}: Error finding container fab9f2daa618ec2e6c77540e2e8a27661d0bb27b3ab74898265e400eb9a698d5: Status 404 returned error can't find the container with id fab9f2daa618ec2e6c77540e2e8a27661d0bb27b3ab74898265e400eb9a698d5 Apr 22 17:39:29.012487 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:29.012391 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dvrrw" event={"ID":"d3a68516-ea37-46c1-bb27-cb34ede968ac","Type":"ContainerStarted","Data":"fab9f2daa618ec2e6c77540e2e8a27661d0bb27b3ab74898265e400eb9a698d5"} Apr 22 17:39:29.013996 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:29.013967 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" event={"ID":"d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae","Type":"ContainerStarted","Data":"d28c0547896370c1edc7e5c5835b3de4b624b10e8fd22eda80ca64b89dc738f1"} Apr 22 17:39:29.033300 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:29.033259 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fjn2s" podStartSLOduration=272.152169743 podStartE2EDuration="4m33.033243713s" podCreationTimestamp="2026-04-22 17:34:56 +0000 UTC" firstStartedPulling="2026-04-22 17:39:27.835072227 +0000 UTC m=+285.176071281" lastFinishedPulling="2026-04-22 17:39:28.716146203 +0000 UTC m=+286.057145251" observedRunningTime="2026-04-22 17:39:29.030721508 +0000 UTC m=+286.371720581" watchObservedRunningTime="2026-04-22 17:39:29.033243713 +0000 UTC m=+286.374242789" Apr 22 17:39:31.025627 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:31.025590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dvrrw" event={"ID":"d3a68516-ea37-46c1-bb27-cb34ede968ac","Type":"ContainerStarted","Data":"648fafeb55b6edcb10562ddbb6b7b5b4cf8b12f41175eb6de28930588d9d99b5"} Apr 22 17:39:31.042609 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:31.042561 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dvrrw" podStartSLOduration=251.979603692 podStartE2EDuration="4m14.042547378s" podCreationTimestamp="2026-04-22 17:35:17 +0000 UTC" firstStartedPulling="2026-04-22 17:39:28.095123147 +0000 UTC m=+285.436122196" lastFinishedPulling="2026-04-22 17:39:30.158066819 +0000 UTC m=+287.499065882" observedRunningTime="2026-04-22 17:39:31.040892998 +0000 UTC m=+288.381892069" watchObservedRunningTime="2026-04-22 17:39:31.042547378 +0000 UTC m=+288.383546449" Apr 22 17:39:43.052071 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:39:43.052038 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:40:18.354039 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:40:18.354000 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:40:18.369907 ip-10-0-143-10 kubenswrapper[2570]: I0422 
17:40:18.369883 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:40:19.181762 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:40:19.181731 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:43:11.263129 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.263092 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-s5vx7"] Apr 22 17:43:11.266362 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.266346 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-s5vx7" Apr 22 17:43:11.268813 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.268790 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 17:43:11.269757 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.269736 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 17:43:11.269757 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.269746 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 17:43:11.269899 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.269746 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-nd6qt\"" Apr 22 17:43:11.274415 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.274377 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-s5vx7"] Apr 22 17:43:11.344768 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.344742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6rdx\" (UniqueName: \"kubernetes.io/projected/53e78460-a453-408d-82b7-e41830e039f3-kube-api-access-q6rdx\") pod \"s3-init-s5vx7\" (UID: 
\"53e78460-a453-408d-82b7-e41830e039f3\") " pod="kserve/s3-init-s5vx7" Apr 22 17:43:11.445487 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.445459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6rdx\" (UniqueName: \"kubernetes.io/projected/53e78460-a453-408d-82b7-e41830e039f3-kube-api-access-q6rdx\") pod \"s3-init-s5vx7\" (UID: \"53e78460-a453-408d-82b7-e41830e039f3\") " pod="kserve/s3-init-s5vx7" Apr 22 17:43:11.454266 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.454239 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6rdx\" (UniqueName: \"kubernetes.io/projected/53e78460-a453-408d-82b7-e41830e039f3-kube-api-access-q6rdx\") pod \"s3-init-s5vx7\" (UID: \"53e78460-a453-408d-82b7-e41830e039f3\") " pod="kserve/s3-init-s5vx7" Apr 22 17:43:11.585599 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.585525 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-s5vx7" Apr 22 17:43:11.706585 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.706392 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-s5vx7"] Apr 22 17:43:11.708811 ip-10-0-143-10 kubenswrapper[2570]: W0422 17:43:11.708775 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53e78460_a453_408d_82b7_e41830e039f3.slice/crio-9896b555e9de4eb63173e9871ef413f7dc8d4b8526719ac752ea665c384cc927 WatchSource:0}: Error finding container 9896b555e9de4eb63173e9871ef413f7dc8d4b8526719ac752ea665c384cc927: Status 404 returned error can't find the container with id 9896b555e9de4eb63173e9871ef413f7dc8d4b8526719ac752ea665c384cc927 Apr 22 17:43:11.710534 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:11.710516 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:43:12.641693 ip-10-0-143-10 kubenswrapper[2570]: I0422 
17:43:12.641639 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-s5vx7" event={"ID":"53e78460-a453-408d-82b7-e41830e039f3","Type":"ContainerStarted","Data":"9896b555e9de4eb63173e9871ef413f7dc8d4b8526719ac752ea665c384cc927"} Apr 22 17:43:16.658151 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:16.658108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-s5vx7" event={"ID":"53e78460-a453-408d-82b7-e41830e039f3","Type":"ContainerStarted","Data":"931a49cf76ee97f332e3860bbdedea0111d7be53b2be175daf6de335be39dd16"} Apr 22 17:43:16.673709 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:16.673664 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-s5vx7" podStartSLOduration=1.290453824 podStartE2EDuration="5.673650576s" podCreationTimestamp="2026-04-22 17:43:11 +0000 UTC" firstStartedPulling="2026-04-22 17:43:11.710667166 +0000 UTC m=+509.051666215" lastFinishedPulling="2026-04-22 17:43:16.093863916 +0000 UTC m=+513.434862967" observedRunningTime="2026-04-22 17:43:16.673280375 +0000 UTC m=+514.014279447" watchObservedRunningTime="2026-04-22 17:43:16.673650576 +0000 UTC m=+514.014649647" Apr 22 17:43:19.668972 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:19.668941 2570 generic.go:358] "Generic (PLEG): container finished" podID="53e78460-a453-408d-82b7-e41830e039f3" containerID="931a49cf76ee97f332e3860bbdedea0111d7be53b2be175daf6de335be39dd16" exitCode=0 Apr 22 17:43:19.669337 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:19.668977 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-s5vx7" event={"ID":"53e78460-a453-408d-82b7-e41830e039f3","Type":"ContainerDied","Data":"931a49cf76ee97f332e3860bbdedea0111d7be53b2be175daf6de335be39dd16"} Apr 22 17:43:20.797549 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:20.797529 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-s5vx7" Apr 22 17:43:20.927820 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:20.927744 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6rdx\" (UniqueName: \"kubernetes.io/projected/53e78460-a453-408d-82b7-e41830e039f3-kube-api-access-q6rdx\") pod \"53e78460-a453-408d-82b7-e41830e039f3\" (UID: \"53e78460-a453-408d-82b7-e41830e039f3\") " Apr 22 17:43:20.930144 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:20.930120 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e78460-a453-408d-82b7-e41830e039f3-kube-api-access-q6rdx" (OuterVolumeSpecName: "kube-api-access-q6rdx") pod "53e78460-a453-408d-82b7-e41830e039f3" (UID: "53e78460-a453-408d-82b7-e41830e039f3"). InnerVolumeSpecName "kube-api-access-q6rdx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:21.029216 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:21.029183 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6rdx\" (UniqueName: \"kubernetes.io/projected/53e78460-a453-408d-82b7-e41830e039f3-kube-api-access-q6rdx\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\"" Apr 22 17:43:21.675468 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:21.675439 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-s5vx7" Apr 22 17:43:21.675468 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:21.675446 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-s5vx7" event={"ID":"53e78460-a453-408d-82b7-e41830e039f3","Type":"ContainerDied","Data":"9896b555e9de4eb63173e9871ef413f7dc8d4b8526719ac752ea665c384cc927"} Apr 22 17:43:21.675468 ip-10-0-143-10 kubenswrapper[2570]: I0422 17:43:21.675472 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9896b555e9de4eb63173e9871ef413f7dc8d4b8526719ac752ea665c384cc927" Apr 22 18:43:34.674353 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.674270 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rxphn/must-gather-ltnkx"] Apr 22 18:43:34.676853 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.674647 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53e78460-a453-408d-82b7-e41830e039f3" containerName="s3-init" Apr 22 18:43:34.676853 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.674661 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e78460-a453-408d-82b7-e41830e039f3" containerName="s3-init" Apr 22 18:43:34.676853 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.674725 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="53e78460-a453-408d-82b7-e41830e039f3" containerName="s3-init" Apr 22 18:43:34.677707 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.677694 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:34.680587 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.680564 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rxphn\"/\"kube-root-ca.crt\"" Apr 22 18:43:34.681048 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.681033 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rxphn\"/\"openshift-service-ca.crt\"" Apr 22 18:43:34.706100 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.706078 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rxphn/must-gather-ltnkx"] Apr 22 18:43:34.710813 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.710792 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-must-gather-output\") pod \"must-gather-ltnkx\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:34.710907 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.710867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9fz\" (UniqueName: \"kubernetes.io/projected/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-kube-api-access-7c9fz\") pod \"must-gather-ltnkx\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:34.811940 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.811905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9fz\" (UniqueName: \"kubernetes.io/projected/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-kube-api-access-7c9fz\") pod \"must-gather-ltnkx\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 
18:43:34.812060 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.811974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-must-gather-output\") pod \"must-gather-ltnkx\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:34.812242 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.812227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-must-gather-output\") pod \"must-gather-ltnkx\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:34.821979 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:34.821961 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9fz\" (UniqueName: \"kubernetes.io/projected/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-kube-api-access-7c9fz\") pod \"must-gather-ltnkx\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:35.000981 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:35.000959 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:43:35.120880 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:35.120859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rxphn/must-gather-ltnkx"] Apr 22 18:43:35.123020 ip-10-0-143-10 kubenswrapper[2570]: W0422 18:43:35.122983 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba63180_0a6f_43b4_ad57_4b2d6be0ce3d.slice/crio-67258d72772764fbc3235035187cf8469b52c4f95b5719946960444aa26839d7 WatchSource:0}: Error finding container 67258d72772764fbc3235035187cf8469b52c4f95b5719946960444aa26839d7: Status 404 returned error can't find the container with id 67258d72772764fbc3235035187cf8469b52c4f95b5719946960444aa26839d7 Apr 22 18:43:35.124594 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:35.124577 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:43:36.049102 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:36.049032 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxphn/must-gather-ltnkx" event={"ID":"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d","Type":"ContainerStarted","Data":"67258d72772764fbc3235035187cf8469b52c4f95b5719946960444aa26839d7"} Apr 22 18:43:41.065608 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:41.065573 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxphn/must-gather-ltnkx" event={"ID":"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d","Type":"ContainerStarted","Data":"4c268d189fbbdbcdcda04c3db169fea0b5932e5d69e3d35982fcfd79664c2fa1"} Apr 22 18:43:41.065608 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:41.065613 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxphn/must-gather-ltnkx" 
event={"ID":"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d","Type":"ContainerStarted","Data":"45d5d4714d3f2be30539353a959a552ee95659a2b017edab69b60c2ec13eae3a"} Apr 22 18:43:41.082473 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:43:41.082426 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rxphn/must-gather-ltnkx" podStartSLOduration=1.914070507 podStartE2EDuration="7.082394252s" podCreationTimestamp="2026-04-22 18:43:34 +0000 UTC" firstStartedPulling="2026-04-22 18:43:35.124723963 +0000 UTC m=+4132.465723012" lastFinishedPulling="2026-04-22 18:43:40.293047699 +0000 UTC m=+4137.634046757" observedRunningTime="2026-04-22 18:43:41.081600101 +0000 UTC m=+4138.422599169" watchObservedRunningTime="2026-04-22 18:43:41.082394252 +0000 UTC m=+4138.423393323" Apr 22 18:44:05.144156 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:05.144119 2570 generic.go:358] "Generic (PLEG): container finished" podID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerID="45d5d4714d3f2be30539353a959a552ee95659a2b017edab69b60c2ec13eae3a" exitCode=0 Apr 22 18:44:05.144567 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:05.144193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rxphn/must-gather-ltnkx" event={"ID":"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d","Type":"ContainerDied","Data":"45d5d4714d3f2be30539353a959a552ee95659a2b017edab69b60c2ec13eae3a"} Apr 22 18:44:05.144567 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:05.144519 2570 scope.go:117] "RemoveContainer" containerID="45d5d4714d3f2be30539353a959a552ee95659a2b017edab69b60c2ec13eae3a" Apr 22 18:44:05.688555 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:05.688519 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxphn_must-gather-ltnkx_3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d/gather/0.log" Apr 22 18:44:06.212029 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.212000 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-cfxdp/must-gather-5ddhx"] Apr 22 18:44:06.215318 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.215301 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.218030 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.218006 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cfxdp\"/\"openshift-service-ca.crt\"" Apr 22 18:44:06.218150 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.218061 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cfxdp\"/\"kube-root-ca.crt\"" Apr 22 18:44:06.219031 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.219010 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-cfxdp\"/\"default-dockercfg-m8sx6\"" Apr 22 18:44:06.227015 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.226998 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cfxdp/must-gather-5ddhx"] Apr 22 18:44:06.286806 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.286779 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjcz\" (UniqueName: \"kubernetes.io/projected/6d624d97-489e-4eeb-8f40-995c08b33ca5-kube-api-access-wjjcz\") pod \"must-gather-5ddhx\" (UID: \"6d624d97-489e-4eeb-8f40-995c08b33ca5\") " pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.286901 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.286820 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d624d97-489e-4eeb-8f40-995c08b33ca5-must-gather-output\") pod \"must-gather-5ddhx\" (UID: \"6d624d97-489e-4eeb-8f40-995c08b33ca5\") " pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 
18:44:06.387777 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.387737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjcz\" (UniqueName: \"kubernetes.io/projected/6d624d97-489e-4eeb-8f40-995c08b33ca5-kube-api-access-wjjcz\") pod \"must-gather-5ddhx\" (UID: \"6d624d97-489e-4eeb-8f40-995c08b33ca5\") " pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.387948 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.387795 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d624d97-489e-4eeb-8f40-995c08b33ca5-must-gather-output\") pod \"must-gather-5ddhx\" (UID: \"6d624d97-489e-4eeb-8f40-995c08b33ca5\") " pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.388159 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.388139 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6d624d97-489e-4eeb-8f40-995c08b33ca5-must-gather-output\") pod \"must-gather-5ddhx\" (UID: \"6d624d97-489e-4eeb-8f40-995c08b33ca5\") " pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.397943 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.397921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjcz\" (UniqueName: \"kubernetes.io/projected/6d624d97-489e-4eeb-8f40-995c08b33ca5-kube-api-access-wjjcz\") pod \"must-gather-5ddhx\" (UID: \"6d624d97-489e-4eeb-8f40-995c08b33ca5\") " pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.524582 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.524513 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cfxdp/must-gather-5ddhx" Apr 22 18:44:06.643981 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:06.643949 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cfxdp/must-gather-5ddhx"] Apr 22 18:44:06.648136 ip-10-0-143-10 kubenswrapper[2570]: W0422 18:44:06.648108 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d624d97_489e_4eeb_8f40_995c08b33ca5.slice/crio-67ef583beb02b8710f1e8e9a4659dcedfa95f67163bc712d35931e1a139c54a6 WatchSource:0}: Error finding container 67ef583beb02b8710f1e8e9a4659dcedfa95f67163bc712d35931e1a139c54a6: Status 404 returned error can't find the container with id 67ef583beb02b8710f1e8e9a4659dcedfa95f67163bc712d35931e1a139c54a6 Apr 22 18:44:07.149941 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:07.149910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/must-gather-5ddhx" event={"ID":"6d624d97-489e-4eeb-8f40-995c08b33ca5","Type":"ContainerStarted","Data":"67ef583beb02b8710f1e8e9a4659dcedfa95f67163bc712d35931e1a139c54a6"} Apr 22 18:44:08.155385 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:08.155351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/must-gather-5ddhx" event={"ID":"6d624d97-489e-4eeb-8f40-995c08b33ca5","Type":"ContainerStarted","Data":"ef403b64793aadac875080dcaf772b6cf81ed9f842a4aae463f37469b1a71e84"} Apr 22 18:44:08.155913 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:08.155889 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/must-gather-5ddhx" event={"ID":"6d624d97-489e-4eeb-8f40-995c08b33ca5","Type":"ContainerStarted","Data":"4939c0ad5060b1f563699393e276c5cdc391e534b951517dd8dc6f3111bd271a"} Apr 22 18:44:08.175542 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:08.175463 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-cfxdp/must-gather-5ddhx" podStartSLOduration=1.163072325 podStartE2EDuration="2.175449577s" podCreationTimestamp="2026-04-22 18:44:06 +0000 UTC" firstStartedPulling="2026-04-22 18:44:06.652478644 +0000 UTC m=+4163.993477697" lastFinishedPulling="2026-04-22 18:44:07.6648559 +0000 UTC m=+4165.005854949" observedRunningTime="2026-04-22 18:44:08.174758117 +0000 UTC m=+4165.515757198" watchObservedRunningTime="2026-04-22 18:44:08.175449577 +0000 UTC m=+4165.516448648" Apr 22 18:44:09.192880 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:09.192834 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vl6r9_dd494ca2-41f2-43cc-bffc-b65701eba67a/global-pull-secret-syncer/0.log" Apr 22 18:44:09.324826 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:09.324791 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tdzw6_96482cfc-a3ad-4187-a8c4-419fcd27a81f/konnectivity-agent/0.log" Apr 22 18:44:09.497873 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:09.497827 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-10.ec2.internal_f35ec5c62ed7016564c2db426231f954/haproxy/0.log" Apr 22 18:44:11.037917 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.037872 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rxphn/must-gather-ltnkx"] Apr 22 18:44:11.038512 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.038172 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rxphn/must-gather-ltnkx" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="copy" containerID="cri-o://4c268d189fbbdbcdcda04c3db169fea0b5932e5d69e3d35982fcfd79664c2fa1" gracePeriod=2 Apr 22 18:44:11.042448 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.041735 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-rxphn/must-gather-ltnkx"] Apr 22 18:44:11.044847 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.044814 2570 status_manager.go:895] "Failed to get status for pod" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" pod="openshift-must-gather-rxphn/must-gather-ltnkx" err="pods \"must-gather-ltnkx\" is forbidden: User \"system:node:ip-10-0-143-10.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rxphn\": no relationship found between node 'ip-10-0-143-10.ec2.internal' and this object" Apr 22 18:44:11.176863 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.176842 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxphn_must-gather-ltnkx_3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d/copy/0.log" Apr 22 18:44:11.177471 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.177444 2570 generic.go:358] "Generic (PLEG): container finished" podID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerID="4c268d189fbbdbcdcda04c3db169fea0b5932e5d69e3d35982fcfd79664c2fa1" exitCode=143 Apr 22 18:44:11.416769 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.416683 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxphn_must-gather-ltnkx_3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d/copy/0.log" Apr 22 18:44:11.417349 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.417232 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rxphn/must-gather-ltnkx" Apr 22 18:44:11.533171 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.531359 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9fz\" (UniqueName: \"kubernetes.io/projected/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-kube-api-access-7c9fz\") pod \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " Apr 22 18:44:11.533171 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.531433 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-must-gather-output\") pod \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\" (UID: \"3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d\") " Apr 22 18:44:11.533171 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.532873 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" (UID: "3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:44:11.545514 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.545458 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-kube-api-access-7c9fz" (OuterVolumeSpecName: "kube-api-access-7c9fz") pod "3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" (UID: "3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d"). InnerVolumeSpecName "kube-api-access-7c9fz". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:44:11.633157 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.633029 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7c9fz\" (UniqueName: \"kubernetes.io/projected/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-kube-api-access-7c9fz\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 22 18:44:11.633157 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:11.633070 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d-must-gather-output\") on node \"ip-10-0-143-10.ec2.internal\" DevicePath \"\""
Apr 22 18:44:12.184226 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:12.184145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rxphn_must-gather-ltnkx_3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d/copy/0.log"
Apr 22 18:44:12.190432 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:12.185260 2570 scope.go:117] "RemoveContainer" containerID="4c268d189fbbdbcdcda04c3db169fea0b5932e5d69e3d35982fcfd79664c2fa1"
Apr 22 18:44:12.190432 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:12.185436 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rxphn/must-gather-ltnkx"
Apr 22 18:44:12.210752 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:12.210728 2570 scope.go:117] "RemoveContainer" containerID="45d5d4714d3f2be30539353a959a552ee95659a2b017edab69b60c2ec13eae3a"
Apr 22 18:44:13.167450 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.167417 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" path="/var/lib/kubelet/pods/3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d/volumes"
Apr 22 18:44:13.271564 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.271490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-p4547_bce37b62-bce4-4fb4-8842-12a172ca9af4/cluster-monitoring-operator/0.log"
Apr 22 18:44:13.366316 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.366286 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56745797b9-6n5gh_215aa1a6-12c0-4aca-a53d-4ef29b1d5c40/metrics-server/0.log"
Apr 22 18:44:13.509800 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.509773 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kdb9m_6825a724-cd96-48d9-9d88-2e764cd2c29b/node-exporter/0.log"
Apr 22 18:44:13.549228 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.549199 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kdb9m_6825a724-cd96-48d9-9d88-2e764cd2c29b/kube-rbac-proxy/0.log"
Apr 22 18:44:13.580456 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.580433 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kdb9m_6825a724-cd96-48d9-9d88-2e764cd2c29b/init-textfile/0.log"
Apr 22 18:44:13.712886 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.712847 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8tbx4_9aea887a-f1ce-4e5a-acf6-9d80afe812bb/kube-rbac-proxy-main/0.log"
Apr 22 18:44:13.741304 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.741272 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8tbx4_9aea887a-f1ce-4e5a-acf6-9d80afe812bb/kube-rbac-proxy-self/0.log"
Apr 22 18:44:13.765499 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.765391 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8tbx4_9aea887a-f1ce-4e5a-acf6-9d80afe812bb/openshift-state-metrics/0.log"
Apr 22 18:44:13.818141 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.818106 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/prometheus/0.log"
Apr 22 18:44:13.840246 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.840222 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/config-reloader/0.log"
Apr 22 18:44:13.868646 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.868595 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/thanos-sidecar/0.log"
Apr 22 18:44:13.896807 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.896776 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/kube-rbac-proxy-web/0.log"
Apr 22 18:44:13.921632 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.921601 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/kube-rbac-proxy/0.log"
Apr 22 18:44:13.966866 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.966827 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/kube-rbac-proxy-thanos/0.log"
Apr 22 18:44:13.998924 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:13.998893 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_17064d57-7459-4283-858a-2b62af750ac3/init-config-reloader/0.log"
Apr 22 18:44:14.126726 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.126695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-f756w_b8b995d5-3b09-42a1-976e-755867230655/prometheus-operator-admission-webhook/0.log"
Apr 22 18:44:14.159420 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.159365 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-57755bc657-fxlbv_6e24c657-c6c6-4873-bdbb-160d9dba6dc3/telemeter-client/0.log"
Apr 22 18:44:14.191067 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.190883 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-57755bc657-fxlbv_6e24c657-c6c6-4873-bdbb-160d9dba6dc3/reload/0.log"
Apr 22 18:44:14.226275 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.226243 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-57755bc657-fxlbv_6e24c657-c6c6-4873-bdbb-160d9dba6dc3/kube-rbac-proxy/0.log"
Apr 22 18:44:14.270075 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.270042 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84777b654d-wqzfk_e3e8ff30-4f30-4a45-8ffc-69de18231068/thanos-query/0.log"
Apr 22 18:44:14.307108 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.307052 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84777b654d-wqzfk_e3e8ff30-4f30-4a45-8ffc-69de18231068/kube-rbac-proxy-web/0.log"
Apr 22 18:44:14.329075 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.329047 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84777b654d-wqzfk_e3e8ff30-4f30-4a45-8ffc-69de18231068/kube-rbac-proxy/0.log"
Apr 22 18:44:14.353703 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.353670 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84777b654d-wqzfk_e3e8ff30-4f30-4a45-8ffc-69de18231068/prom-label-proxy/0.log"
Apr 22 18:44:14.378045 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.377967 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84777b654d-wqzfk_e3e8ff30-4f30-4a45-8ffc-69de18231068/kube-rbac-proxy-rules/0.log"
Apr 22 18:44:14.401283 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:14.401247 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-84777b654d-wqzfk_e3e8ff30-4f30-4a45-8ffc-69de18231068/kube-rbac-proxy-metrics/0.log"
Apr 22 18:44:15.394835 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:15.394806 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fjn2s_d20cafc2-7331-4bf9-ae5c-8d94d62ff5ae/networking-console-plugin/0.log"
Apr 22 18:44:16.161137 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161096 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"]
Apr 22 18:44:16.161527 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161512 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="gather"
Apr 22 18:44:16.161592 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161530 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="gather"
Apr 22 18:44:16.161592 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161552 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="copy"
Apr 22 18:44:16.161592 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161558 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="copy"
Apr 22 18:44:16.161696 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161622 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="gather"
Apr 22 18:44:16.161696 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.161632 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ba63180-0a6f-43b4-ad57-4b2d6be0ce3d" containerName="copy"
Apr 22 18:44:16.165962 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.165942 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.170791 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.170768 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"]
Apr 22 18:44:16.277895 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.277849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-lib-modules\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.278050 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.277899 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h9mm\" (UniqueName: \"kubernetes.io/projected/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-kube-api-access-7h9mm\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.278050 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.277981 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-proc\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.278050 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.278019 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-podres\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.278050 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.278048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-sys\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379085 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h9mm\" (UniqueName: \"kubernetes.io/projected/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-kube-api-access-7h9mm\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379245 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-proc\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379245 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-podres\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379245 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-sys\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379245 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-lib-modules\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379452 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379263 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-proc\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379452 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379308 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-podres\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379452 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-sys\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.379452 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.379366 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-lib-modules\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.387790 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.387768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h9mm\" (UniqueName: \"kubernetes.io/projected/8ae3bcb0-02ec-4c47-ac37-9e586bc07d97-kube-api-access-7h9mm\") pod \"perf-node-gather-daemonset-96glx\" (UID: \"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.476829 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.476802 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:16.618552 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:16.618520 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"]
Apr 22 18:44:17.207347 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:17.207315 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx" event={"ID":"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97","Type":"ContainerStarted","Data":"7cfe680955eaa9904e7a79ad1c4783105e50e925d2137a340104e041c7550fb4"}
Apr 22 18:44:17.207347 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:17.207351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx" event={"ID":"8ae3bcb0-02ec-4c47-ac37-9e586bc07d97","Type":"ContainerStarted","Data":"6b4a7a90546f3b18609b97a50df7bf7b0bc2dd86c9cdf058893abef71bec10eb"}
Apr 22 18:44:17.207564 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:17.207442 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:17.390129 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:17.390097 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hknm7_edd141af-4f14-4224-b057-0cd35252fcd8/dns/0.log"
Apr 22 18:44:17.431098 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:17.431072 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hknm7_edd141af-4f14-4224-b057-0cd35252fcd8/kube-rbac-proxy/0.log"
Apr 22 18:44:17.622658 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:17.622567 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8fsg9_83519001-bdda-4c9d-ab90-db32b4638392/dns-node-resolver/0.log"
Apr 22 18:44:18.365152 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:18.365114 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zff9d_a345adc2-0a7b-481f-ad9d-9acdfefd72d1/node-ca/0.log"
Apr 22 18:44:19.609829 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:19.609799 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dvrrw_d3a68516-ea37-46c1-bb27-cb34ede968ac/serve-healthcheck-canary/0.log"
Apr 22 18:44:20.152422 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:20.152363 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rr4td_20f22498-ba1e-4fad-8fc7-110f430def54/kube-rbac-proxy/0.log"
Apr 22 18:44:20.174128 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:20.174101 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rr4td_20f22498-ba1e-4fad-8fc7-110f430def54/exporter/0.log"
Apr 22 18:44:20.211097 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:20.211077 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rr4td_20f22498-ba1e-4fad-8fc7-110f430def54/extractor/0.log"
Apr 22 18:44:22.924513 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:22.924487 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-s5vx7_53e78460-a453-408d-82b7-e41830e039f3/s3-init/0.log"
Apr 22 18:44:23.220157 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:23.220128 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx"
Apr 22 18:44:23.249542 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:23.249500 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-96glx" podStartSLOduration=7.249486506 podStartE2EDuration="7.249486506s" podCreationTimestamp="2026-04-22 18:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:44:17.229462682 +0000 UTC m=+4174.570461754" watchObservedRunningTime="2026-04-22 18:44:23.249486506 +0000 UTC m=+4180.590485576"
Apr 22 18:44:27.026870 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:27.026839 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ts42j_444ac662-374e-4a35-971b-15f8e3f58a16/migrator/0.log"
Apr 22 18:44:27.053187 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:27.053158 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ts42j_444ac662-374e-4a35-971b-15f8e3f58a16/graceful-termination/0.log"
Apr 22 18:44:28.453913 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.453880 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8vkqt_5b37faae-6be3-4973-8048-7a21fab3256d/kube-multus/0.log"
Apr 22 18:44:28.499371 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.499345 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:44:28.541179 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.541148 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/egress-router-binary-copy/0.log"
Apr 22 18:44:28.578325 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.578301 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/cni-plugins/0.log"
Apr 22 18:44:28.634119 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.634095 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/bond-cni-plugin/0.log"
Apr 22 18:44:28.679930 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.679904 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/routeoverride-cni/0.log"
Apr 22 18:44:28.724031 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.724010 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/whereabouts-cni-bincopy/0.log"
Apr 22 18:44:28.761413 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:28.761377 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2z7l2_7a77318f-12ee-48f6-8626-11e2875f970b/whereabouts-cni/0.log"
Apr 22 18:44:29.564973 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:29.564947 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-srjdz_0145db4f-d1c7-42f4-8607-b305371c3756/network-metrics-daemon/0.log"
Apr 22 18:44:29.604783 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:29.604751 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-srjdz_0145db4f-d1c7-42f4-8607-b305371c3756/kube-rbac-proxy/0.log"
Apr 22 18:44:31.598824 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:31.598794 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/ovn-controller/0.log"
Apr 22 18:44:31.718883 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:31.718845 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/ovn-acl-logging/0.log"
Apr 22 18:44:31.838388 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:31.838358 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/kube-rbac-proxy-node/0.log"
Apr 22 18:44:31.904833 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:31.904756 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:44:31.951562 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:31.951535 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/northd/0.log"
Apr 22 18:44:32.014049 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:32.014016 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/nbdb/0.log"
Apr 22 18:44:32.087193 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:32.087153 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/sbdb/0.log"
Apr 22 18:44:32.257479 ip-10-0-143-10 kubenswrapper[2570]: I0422 18:44:32.257451 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tpwrl_acc3c714-ca80-45fe-a1b0-14e012c3d912/ovnkube-controller/0.log"