Apr 16 16:00:18.772551 ip-10-0-136-220 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 16 16:00:18.772561 ip-10-0-136-220 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 16 16:00:18.772568 ip-10-0-136-220 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 16 16:00:18.772784 ip-10-0-136-220 systemd[1]: Failed to start Kubernetes Kubelet. Apr 16 16:00:28.777259 ip-10-0-136-220 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 16 16:00:28.777276 ip-10-0-136-220 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot b7bf31fccb9d43ce956b719e5dd23323 -- Apr 16 16:03:00.421008 ip-10-0-136-220 systemd[1]: Starting Kubernetes Kubelet... Apr 16 16:03:00.964274 ip-10-0-136-220 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 16:03:00.964274 ip-10-0-136-220 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 16 16:03:00.964274 ip-10-0-136-220 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 16:03:00.964274 ip-10-0-136-220 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Apr 16 16:03:00.964274 ip-10-0-136-220 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 16:03:00.965323 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.965223 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 16:03:00.967687 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967671 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:00.967687 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967686 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967691 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967695 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967698 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967701 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967704 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967706 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967709 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 
16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967711 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967714 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967717 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967719 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967722 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967725 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967727 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967730 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967732 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967735 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967737 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967746 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:00.967752 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967749 2566 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967752 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967755 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967759 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967762 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967765 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967767 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967770 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967772 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967775 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967778 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967782 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967785 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967788 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967791 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967794 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967797 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967799 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967802 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:00.968247 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967805 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967807 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967809 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967812 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967815 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: 
W0416 16:03:00.967817 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967820 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967822 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967825 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967827 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967830 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967833 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967835 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967837 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967840 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967844 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967847 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967850 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 
16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967852 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967855 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:00.968780 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967857 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967861 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967863 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967866 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967868 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967871 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967874 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967876 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967878 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967881 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967885 2566 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967888 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967892 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967896 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967899 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967902 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967905 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967908 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967910 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967913 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:00.969264 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967916 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967919 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967923 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 
16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967925 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967928 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.967931 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968331 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968336 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968339 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968342 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968345 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968347 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968350 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968353 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968356 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968358 2566 feature_gate.go:328] unrecognized 
feature gate: GatewayAPI Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968361 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968363 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968366 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968369 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:00.969811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968372 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968374 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968377 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968381 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968383 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968386 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968388 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968391 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968395 
2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968398 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968402 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968406 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968409 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968411 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968415 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968418 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968421 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968423 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968427 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:00.970287 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968429 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968432 2566 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968435 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968437 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968440 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968442 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968445 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968447 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968450 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968453 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968455 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968458 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968461 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968464 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:00.970811 ip-10-0-136-220 
kubenswrapper[2566]: W0416 16:03:00.968466 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968469 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968472 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968475 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968478 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968480 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:00.970811 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968483 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968485 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968488 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968490 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968493 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968495 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968498 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata 
Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968500 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968503 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968506 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968509 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968512 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968515 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968518 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968520 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968523 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968526 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968528 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968531 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 
16:03:00.968533 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:00.971331 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968536 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968538 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968541 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968544 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968546 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968549 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968551 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968554 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968556 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968559 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968562 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968564 2566 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.968567 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969826 2566 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969841 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969850 2566 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969854 2566 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969860 2566 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969864 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969869 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969874 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 16:03:00.971909 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969877 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969881 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969884 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969888 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 16:03:00.972483 ip-10-0-136-220 
kubenswrapper[2566]: I0416 16:03:00.969891 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969894 2566 flags.go:64] FLAG: --cgroup-root="" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969898 2566 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969901 2566 flags.go:64] FLAG: --client-ca-file="" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969903 2566 flags.go:64] FLAG: --cloud-config="" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969906 2566 flags.go:64] FLAG: --cloud-provider="external" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969909 2566 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969913 2566 flags.go:64] FLAG: --cluster-domain="" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969916 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969919 2566 flags.go:64] FLAG: --config-dir="" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969922 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969925 2566 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969930 2566 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969934 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969938 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" 
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969941 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969944 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969948 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969951 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969954 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969957 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:03:00.972483 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969961 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969965 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969968 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969971 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969974 2566 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969977 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969982 2566 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969985 2566 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969988 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969992 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969995 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.969999 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970002 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970005 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970008 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970011 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970014 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970017 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970020 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970023 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970026 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970029 2566 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970033 2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970036 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970039 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:03:00.973175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970043 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970046 2566 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970049 2566 flags.go:64] FLAG: --help="false"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970052 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-136-220.ec2.internal"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970055 2566 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970059 2566 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970062 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970065 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970068 2566 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970072 2566 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970075 2566 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970078 2566 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970080 2566 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970083 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970086 2566 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970089 2566 flags.go:64] FLAG: --kube-reserved=""
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970092 2566 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970095 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970098 2566 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970101 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970104 2566 flags.go:64] FLAG: --lock-file=""
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970107 2566 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970110 2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970113 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 16:03:00.973868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970119 2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970122 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970125 2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970128 2566 flags.go:64] FLAG: --logging-format="text"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970131 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970134 2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970137 2566 flags.go:64] FLAG: --manifest-url=""
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970140 2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970144 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970148 2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970153 2566 flags.go:64] FLAG: --max-pods="110"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970156 2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970159 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970162 2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970166 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970168 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970172 2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970175 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970182 2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970186 2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970190 2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970193 2566 flags.go:64] FLAG: --pod-cidr=""
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970196 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 16:03:00.974671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970202 2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970205 2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970208 2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970211 2566 flags.go:64] FLAG: --port="10250"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970215 2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970217 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a62a60879fa053ac"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970221 2566 flags.go:64] FLAG: --qos-reserved=""
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970224 2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970227 2566 flags.go:64] FLAG: --register-node="true"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970230 2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970232 2566 flags.go:64] FLAG: --register-with-taints=""
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970236 2566 flags.go:64] FLAG: --registry-burst="10"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970239 2566 flags.go:64] FLAG: --registry-qps="5"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970242 2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970245 2566 flags.go:64] FLAG: --reserved-memory=""
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970249 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970252 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970255 2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970258 2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970265 2566 flags.go:64] FLAG: --runonce="false"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970268 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970271 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970274 2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970277 2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970281 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970284 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 16:03:00.975292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970287 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970291 2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970294 2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970297 2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970301 2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970304 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970308 2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970311 2566 flags.go:64] FLAG: --system-cgroups=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970314 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970320 2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970323 2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970326 2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970331 2566 flags.go:64] FLAG: --tls-min-version=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970334 2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970337 2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970339 2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970343 2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970346 2566 flags.go:64] FLAG: --v="2"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970350 2566 flags.go:64] FLAG: --version="false"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970354 2566 flags.go:64] FLAG: --vmodule=""
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970359 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970362 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970468 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970471 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:03:00.975974 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970474 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970477 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970480 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970483 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970485 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970488 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970491 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970493 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970498 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970502 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970505 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970508 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970511 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970514 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970517 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970520 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970523 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970526 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970529 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:03:00.976605 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970532 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970534 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970537 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970540 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970543 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970546 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970548 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970551 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970554 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970557 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970560 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970562 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970568 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970571 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970574 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970576 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970579 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970582 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970585 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970587 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:03:00.977108 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970590 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970593 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970595 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970598 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970600 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970603 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970605 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970608 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970611 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970628 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970633 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970637 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970641 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970644 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970647 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970649 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970653 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970656 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970659 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:03:00.977629 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970661 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970664 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970666 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970669 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970671 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970675 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970678 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970682 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970686 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970688 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970693 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970696 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970701 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970703 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970706 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970708 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970711 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970714 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970716 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:03:00.978109 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970719 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970721 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970724 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970727 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970729 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970732 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.970735 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:03:00.978595 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.970740 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:03:00.979281 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.979258 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:03:00.979316 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.979282 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:03:00.979347 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979335 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:03:00.979347 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979341 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:03:00.979347 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979344 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:03:00.979347 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979348 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979352 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979357 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979362 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979365 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979368 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979371 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979374 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979377 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979380 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979383 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979386 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979388 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979391 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979393 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979396 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979400 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979402 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979405 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:03:00.979457 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979408 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979410 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979413 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979416 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416
16:03:00.979419 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979422 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979424 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979427 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979430 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979433 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979436 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979439 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979441 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979444 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979446 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979450 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979453 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 
16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979455 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979458 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979461 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:00.979936 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979464 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979467 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979470 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979473 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979475 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979478 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979481 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979483 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979486 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979489 2566 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979491 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979494 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979496 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979499 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979501 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979504 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979507 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979526 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979530 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979533 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:00.980419 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979535 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979538 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 
16:03:00.979541 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979544 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979547 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979549 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979552 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979555 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979558 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979560 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979564 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979569 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979572 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979574 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979577 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979579 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979582 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979584 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979587 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:00.980975 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979591 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979593 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979596 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979599 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 
16:03:00.979601 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.979607 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979718 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979723 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979727 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979730 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979733 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979736 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979739 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979742 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:03:00.981446 ip-10-0-136-220 
kubenswrapper[2566]: W0416 16:03:00.979745 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:03:00.981446 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979748 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979752 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979754 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979758 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979760 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979764 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979767 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979770 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979772 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979775 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979777 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979780 2566 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979783 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979786 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979788 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979791 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979794 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979797 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979799 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979803 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:03:00.981842 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979805 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979808 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979811 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979813 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979817 2566 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979820 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979823 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979826 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979829 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979833 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979835 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979838 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979841 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979843 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979846 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979850 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979852 2566 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979855 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979857 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979860 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:03:00.982323 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979863 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979865 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979868 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979870 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979873 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979876 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979879 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979881 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979884 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:03:00.982872 ip-10-0-136-220 
kubenswrapper[2566]: W0416 16:03:00.979887 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979889 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979892 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979895 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979897 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979900 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979902 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979905 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979908 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979910 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:03:00.982872 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979913 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979917 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979920 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979923 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979926 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979928 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979931 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979934 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979937 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979939 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979942 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979945 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979947 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979950 2566 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979953 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979955 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979958 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:03:00.983318 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:00.979960 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:03:00.983743 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.979966 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:03:00.983743 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.980749 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:03:00.983743 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.982916 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:03:00.983991 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.983979 2566 server.go:1019] "Starting client certificate rotation" Apr 16 16:03:00.984097 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.984080 2566 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:03:00.985017 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:00.985005 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:03:01.014787 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.014763 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:03:01.020690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.020659 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:03:01.038967 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.038939 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:03:01.045416 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.045391 2566 log.go:25] "Validated CRI v1 image API"
Apr 16 16:03:01.046145 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.046126 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:03:01.047441 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.047420 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:03:01.051103 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.051076 2566 fs.go:135] Filesystem UUIDs: map[2550f7de-9a32-47fa-beea-2552fc3a27ec:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f551606b-eae9-49de-b1e2-2435915f7fc1:/dev/nvme0n1p4]
Apr 16 16:03:01.051159 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.051103 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:03:01.057707 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.057565 2566 manager.go:217] Machine: {Timestamp:2026-04-16 16:03:01.055435414 +0000 UTC m=+0.495963866 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3091597 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec275c845e90085931ad9526b98a670e SystemUUID:ec275c84-5e90-0859-31ad-9526b98a670e BootID:b7bf31fc-cb9d-43ce-956b-719e5dd23323 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:48:5d:de:e0:cf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:48:5d:de:e0:cf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b2:85:60:c5:02:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:03:01.057707 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.057700 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:03:01.057828 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.057787 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:03:01.058224 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.058197 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:03:01.058384 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.058225 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-220.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:03:01.058430 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.058394 2566 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:03:01.058430 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.058403 2566 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:03:01.058430 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.058416 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:03:01.059219 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.059208 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:03:01.060647 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.060635 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:03:01.060759 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.060751 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:03:01.061950 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.061934 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kcz6x"
Apr 16 16:03:01.063964 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.063951 2566 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:03:01.064808 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.064796 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:03:01.064847 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.064822 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:03:01.064847 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.064832 2566 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:03:01.064847 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.064844 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:03:01.066967 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.066949 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:03:01.067054 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.066975 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:03:01.068965 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.068944 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kcz6x"
Apr 16 16:03:01.070841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.070824 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:03:01.073022 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.073008 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:03:01.074573 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074560 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074577 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074583 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074589 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074595 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074601 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074607 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074626 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074635 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:03:01.074645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074644 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:03:01.074906 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074654 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:03:01.074906 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.074663 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:03:01.077585 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.077542 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:03:01.077698 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.077650 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:03:01.078656 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.078640 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:03:01.081079 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.081050 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:03:01.082948 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.082933 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:03:01.083031 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.082971 2566 server.go:1295] "Started kubelet"
Apr 16 16:03:01.083083 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.083013 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:03:01.083153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.083121 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:03:01.083188 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.083173 2566 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:03:01.083771 ip-10-0-136-220 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:03:01.084531 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.084516 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:03:01.085228 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.085213 2566 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:03:01.089068 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.089048 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-136-220.ec2.internal" not found
Apr 16 16:03:01.092453 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.092436 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:03:01.092547 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.092441 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:03:01.092547 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.092520 2566 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:03:01.093157 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.093141 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:03:01.093157 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.093144 2566 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:03:01.093247 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.093167 2566 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:03:01.093288 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.093266 2566 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:03:01.093288 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.093273 2566 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:03:01.093728 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.093695 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-220.ec2.internal\" not found"
Apr 16 16:03:01.094060 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094021 2566 factory.go:55] Registering systemd factory
Apr 16 16:03:01.094060 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094037 2566 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:03:01.094575 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094561 2566 factory.go:153] Registering CRI-O factory
Apr 16 16:03:01.094709 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094698 2566 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:03:01.094775 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094662 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:03:01.094830 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094815 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:03:01.094883 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094839 2566 factory.go:103] Registering Raw factory
Apr 16 16:03:01.094883 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.094857 2566 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:03:01.095291 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.095263 2566 manager.go:319] Starting recovery of all containers
Apr 16 16:03:01.097532 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.097508 2566 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-220.ec2.internal\" not found" node="ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.102499 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.102356 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:03:01.104968 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.104840 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-136-220.ec2.internal" not found
Apr 16 16:03:01.105115 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.105100 2566 manager.go:324] Recovery completed
Apr 16 16:03:01.110932 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.110914 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:03:01.114947 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.114929 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:03:01.115016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.114963 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:03:01.115016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.114973 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:03:01.115479 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.115467 2566 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:03:01.115537 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.115480 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:03:01.115537 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.115508 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:03:01.118572 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.118557 2566 policy_none.go:49] "None policy: Start"
Apr 16 16:03:01.118666 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.118576 2566 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:03:01.118666 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.118590 2566 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:03:01.156233 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156214 2566 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.156267 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156281 2566 server.go:85] "Starting device plugin registration server"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156593 2566 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156609 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156749 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156831 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.156842 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.157501 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.157544 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-220.ec2.internal\" not found"
Apr 16 16:03:01.175892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.161016 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-136-220.ec2.internal" not found
Apr 16 16:03:01.198704 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.198677 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:03:01.198847 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.198712 2566 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:03:01.198847 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.198731 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:03:01.198847 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.198737 2566 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:03:01.198847 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:01.198770 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:03:01.203008 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.202987 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:03:01.256928 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.256863 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:03:01.257935 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.257918 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:03:01.258029 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.257948 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:03:01.258029 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.257960 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:03:01.258029 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.257981 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.267299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.267278 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.299134 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.299103 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"]
Apr 16 16:03:01.302957 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.302935 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.302957 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.302946 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.326270 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.326244 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.328639 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.328607 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.346650 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.346609 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:03:01.359036 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.359014 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:03:01.394994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.394959 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a79d77c3a5cce0b825c16bee49a58996-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal\" (UID: \"a79d77c3a5cce0b825c16bee49a58996\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.495442 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.495404 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a79d77c3a5cce0b825c16bee49a58996-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal\" (UID: \"a79d77c3a5cce0b825c16bee49a58996\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.495442 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.495444 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a79d77c3a5cce0b825c16bee49a58996-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal\" (UID: \"a79d77c3a5cce0b825c16bee49a58996\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.495667 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.495464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ba662560bc4f4ac2db429e0e7b17ae15-config\") pod \"kube-apiserver-proxy-ip-10-0-136-220.ec2.internal\" (UID: \"ba662560bc4f4ac2db429e0e7b17ae15\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.495667 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.495526 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a79d77c3a5cce0b825c16bee49a58996-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal\" (UID: \"a79d77c3a5cce0b825c16bee49a58996\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.596032 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.595935 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a79d77c3a5cce0b825c16bee49a58996-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal\" (UID: \"a79d77c3a5cce0b825c16bee49a58996\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.596032 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.595969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ba662560bc4f4ac2db429e0e7b17ae15-config\") pod \"kube-apiserver-proxy-ip-10-0-136-220.ec2.internal\" (UID: \"ba662560bc4f4ac2db429e0e7b17ae15\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.596032 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.596017 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ba662560bc4f4ac2db429e0e7b17ae15-config\") pod \"kube-apiserver-proxy-ip-10-0-136-220.ec2.internal\" (UID: \"ba662560bc4f4ac2db429e0e7b17ae15\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.596240 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.596046 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a79d77c3a5cce0b825c16bee49a58996-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal\" (UID: \"a79d77c3a5cce0b825c16bee49a58996\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.651049 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.651002 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.662589 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.662566 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal"
Apr 16 16:03:01.983986 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.983909 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:03:01.984598 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.984079 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:03:01.984598 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.984086 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:03:01.984598 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:01.984079 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:03:02.066020 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.065990 2566 apiserver.go:52] "Watching apiserver"
Apr 16 16:03:02.071266 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.071224 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:58:01 +0000 UTC" deadline="2027-10-24 22:36:09.811992411 +0000 UTC"
Apr 16 16:03:02.071266 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.071254 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13350h33m7.740740911s"
Apr 16 16:03:02.073821 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.073806 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:03:02.074170 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.074150 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-9wz49","kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm","openshift-dns/node-resolver-mjkzm","openshift-image-registry/node-ca-kl4jm","openshift-multus/multus-vscz2","openshift-network-operator/iptables-alerter-lhzgg","openshift-ovn-kubernetes/ovnkube-node-bkhjf","openshift-cluster-node-tuning-operator/tuned-72bdw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal","openshift-multus/multus-additional-cni-plugins-pmzx9","openshift-multus/network-metrics-daemon-4kkrg","openshift-network-diagnostics/network-check-target-gvxkh"]
Apr 16 16:03:02.077361 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.077346 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9wz49"
Apr 16 16:03:02.080421 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.080368 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.080421 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.080387 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:03:02.080421 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.080417 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:03:02.080659 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.080503 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5h28f\""
Apr 16 16:03:02.081749 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.081730 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mjkzm"
Apr 16 16:03:02.082347 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.082331 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:03:02.082732 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.082718 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-crzgr\""
Apr 16 16:03:02.082950 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.082933 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:03:02.083032 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.082962 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:03:02.083387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.083373 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kl4jm"
Apr 16 16:03:02.083706 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.083693 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:03:02.083902 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.083888 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:03:02.084004 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.083991 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bfcxf\""
Apr 16 16:03:02.085054 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.085024 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.085157 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.085123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lhzgg"
Apr 16 16:03:02.085632 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.085600 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7lvrj\""
Apr 16 16:03:02.085720 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.085634 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:03:02.085720 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.085666 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:03:02.085843 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.085638 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:03:02.086860 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.086844 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.087319 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.087298 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:03:02.087400 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.087373 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:03:02.087681 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.087664 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:03:02.088243 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088225 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.088588 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088571 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:03:02.088695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088598 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:03:02.088695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088654 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:03:02.088695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088654 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dpq8v\""
Apr 16 16:03:02.088821 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088700 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:03:02.088821 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.088743 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rrbwl\""
Apr 16 16:03:02.089270 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.089258 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:03:02.089544 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.089531 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:03:02.089927 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.089912 2566 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8nt5q\"" Apr 16 16:03:02.089998 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.089918 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:03:02.090053 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090005 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:03:02.090243 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090230 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.091025 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090745 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:03:02.091025 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090826 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:03:02.091025 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090853 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:03:02.091025 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090831 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:03:02.091025 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.090882 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bbkpr\"" Apr 16 16:03:02.091854 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.091838 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.091937 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.091920 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:02.092430 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.092412 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:03:02.092575 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.092559 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:03:02.092657 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.092581 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:03:02.092745 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.092727 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kkxt5\"" Apr 16 16:03:02.093630 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.093601 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:02.093773 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.093749 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:02.099491 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.099336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-sys-fs\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.099745 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-netns\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.099857 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcv4t\" (UniqueName: \"kubernetes.io/projected/eb39d18c-d897-41cd-b539-6c31f7f376e3-kube-api-access-pcv4t\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.099912 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzg77\" (UniqueName: \"kubernetes.io/projected/3e5c15b5-f8c7-478b-a327-14aad8952c3f-kube-api-access-bzg77\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.099943 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/48f3d044-df7a-4aec-b463-07afb7514443-agent-certs\") pod \"konnectivity-agent-9wz49\" (UID: \"48f3d044-df7a-4aec-b463-07afb7514443\") " pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.099977 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-tuned\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100019 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-cni-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.100148 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100093 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5bb\" (UniqueName: \"kubernetes.io/projected/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-kube-api-access-zl5bb\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " 
pod="openshift-dns/node-resolver-mjkzm" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100159 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-systemd\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100225 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-ovnkube-config\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100274 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-etc-kubernetes\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100373 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100421 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-ovn\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100450 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100482 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-sys\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.100530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100516 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-slash\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100590 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hrfr\" (UniqueName: \"kubernetes.io/projected/19e9bbf4-2f93-4060-aa4f-2838412f8254-kube-api-access-4hrfr\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100641 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10efae8-7168-40df-b502-62b0d2d36756-host-slash\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzf77\" (UniqueName: \"kubernetes.io/projected/f10efae8-7168-40df-b502-62b0d2d36756-kube-api-access-rzf77\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100708 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100743 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-os-release\") pod \"multus-vscz2\" (UID: 
\"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100787 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-socket-dir-parent\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100874 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-system-cni-dir\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.100977 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100911 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.100972 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-cni-netd\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-systemd\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101108 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-device-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101167 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19e9bbf4-2f93-4060-aa4f-2838412f8254-host\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101210 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-etc-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101240 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-modprobe-d\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 
16:03:02.101265 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-cni-multus\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101297 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-kubelet\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.101354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101325 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-multus-certs\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101363 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101399 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-hosts-file\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " 
pod="openshift-dns/node-resolver-mjkzm" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101422 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-log-socket\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101451 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06eb6e48-21ae-44ee-bf36-e4206b109746-ovn-node-metrics-cert\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/19e9bbf4-2f93-4060-aa4f-2838412f8254-serviceca\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-cni-bin\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101547 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-ovnkube-script-lib\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phpkw\" (UniqueName: \"kubernetes.io/projected/06eb6e48-21ae-44ee-bf36-e4206b109746-kube-api-access-phpkw\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101612 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysctl-conf\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101667 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-var-lib-kubelet\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101696 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjsww\" (UniqueName: \"kubernetes.io/projected/2f64ce22-0fab-4753-8319-62ac8a354b24-kube-api-access-zjsww\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 
16:03:02.101729 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-socket-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.101769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-system-cni-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101782 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-tmp-dir\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101811 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8zg\" (UniqueName: \"kubernetes.io/projected/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-kube-api-access-cg8zg\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-cni-binary-copy\") pod \"multus-vscz2\" (UID: 
\"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101868 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f10efae8-7168-40df-b502-62b0d2d36756-iptables-alerter-script\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-run-netns\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101922 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101950 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-cni-bin\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.101978 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-cnibin\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102007 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysctl-d\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102035 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-lib-modules\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-hostroot\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102125 
2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-conf-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznpp\" (UniqueName: \"kubernetes.io/projected/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-kube-api-access-tznpp\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102183 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-os-release\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.102299 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102220 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-node-log\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102249 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-daemon-config\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " 
pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102281 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-registration-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102306 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/48f3d044-df7a-4aec-b463-07afb7514443-konnectivity-ca\") pod \"konnectivity-agent-9wz49\" (UID: \"48f3d044-df7a-4aec-b463-07afb7514443\") " pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-kubernetes\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102452 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-k8s-cni-cncf-io\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102495 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-kubelet\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102544 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-var-lib-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102603 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-env-overrides\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102942 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-host\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.102983 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.102885 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:03:02.103403 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103000 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.103403 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-systemd-units\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.103403 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103121 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysconfig\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.103403 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103160 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-cnibin\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.103403 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103190 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.103403 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103290 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-run\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.103706 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.103417 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f64ce22-0fab-4753-8319-62ac8a354b24-tmp\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.126920 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.126896 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-blwq6" Apr 16 16:03:02.138923 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.138897 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-blwq6" Apr 16 16:03:02.180445 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.180407 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba662560bc4f4ac2db429e0e7b17ae15.slice/crio-d266b8d22abed59ddc4edb9cbdddf9381bed2c5e90b9f2807c7ba9ce8e3c209b WatchSource:0}: Error finding container d266b8d22abed59ddc4edb9cbdddf9381bed2c5e90b9f2807c7ba9ce8e3c209b: Status 404 returned error can't find the container with id d266b8d22abed59ddc4edb9cbdddf9381bed2c5e90b9f2807c7ba9ce8e3c209b Apr 16 16:03:02.189446 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.189429 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:03:02.194485 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.194465 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired 
state of world" Apr 16 16:03:02.199239 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.199212 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79d77c3a5cce0b825c16bee49a58996.slice/crio-5cdc024c9a6cda5b69351e55a0c89546144d2329a5f5c5aec32c6141addaf4b4 WatchSource:0}: Error finding container 5cdc024c9a6cda5b69351e55a0c89546144d2329a5f5c5aec32c6141addaf4b4: Status 404 returned error can't find the container with id 5cdc024c9a6cda5b69351e55a0c89546144d2329a5f5c5aec32c6141addaf4b4 Apr 16 16:03:02.201436 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.201396 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal" event={"ID":"a79d77c3a5cce0b825c16bee49a58996","Type":"ContainerStarted","Data":"5cdc024c9a6cda5b69351e55a0c89546144d2329a5f5c5aec32c6141addaf4b4"} Apr 16 16:03:02.202679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.202591 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal" event={"ID":"ba662560bc4f4ac2db429e0e7b17ae15","Type":"ContainerStarted","Data":"d266b8d22abed59ddc4edb9cbdddf9381bed2c5e90b9f2807c7ba9ce8e3c209b"} Apr 16 16:03:02.203842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203821 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-sys-fs\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.203896 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-netns\") pod 
\"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.203896 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcv4t\" (UniqueName: \"kubernetes.io/projected/eb39d18c-d897-41cd-b539-6c31f7f376e3-kube-api-access-pcv4t\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.203896 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203888 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzg77\" (UniqueName: \"kubernetes.io/projected/3e5c15b5-f8c7-478b-a327-14aad8952c3f-kube-api-access-bzg77\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.204016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203937 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-netns\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.204016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203940 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-sys-fs\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.204016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203963 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/48f3d044-df7a-4aec-b463-07afb7514443-agent-certs\") pod \"konnectivity-agent-9wz49\" (UID: \"48f3d044-df7a-4aec-b463-07afb7514443\") " pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:02.204016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.203996 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-tuned\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.204016 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-cni-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204036 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204064 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5bb\" (UniqueName: \"kubernetes.io/projected/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-kube-api-access-zl5bb\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-systemd\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204115 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-ovnkube-config\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-etc-kubernetes\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204169 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-systemd\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204190 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-ovn\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204347 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-cni-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204371 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204430 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-ovn\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.204431 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204466 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-etc-kubernetes\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.204554 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:02.704500518 +0000 UTC m=+2.145029045 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204559 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.204841 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204749 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-sys\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204773 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-slash\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204842 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hrfr\" (UniqueName: \"kubernetes.io/projected/19e9bbf4-2f93-4060-aa4f-2838412f8254-kube-api-access-4hrfr\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204870 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10efae8-7168-40df-b502-62b0d2d36756-host-slash\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204880 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-sys\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzf77\" (UniqueName: \"kubernetes.io/projected/f10efae8-7168-40df-b502-62b0d2d36756-kube-api-access-rzf77\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 
16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204961 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-os-release\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.204989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-socket-dir-parent\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205014 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-system-cni-dir\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205034 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205056 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-cni-netd\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-systemd\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-device-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205116 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19e9bbf4-2f93-4060-aa4f-2838412f8254-host\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205139 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-etc-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.205695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205147 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-os-release\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " 
pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-ovnkube-config\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205183 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-etc-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205181 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-slash\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205196 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10efae8-7168-40df-b502-62b0d2d36756-host-slash\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205211 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-modprobe-d\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205242 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-socket-dir-parent\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205245 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-cni-multus\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205273 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-kubelet\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205285 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205295 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-multus-certs\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205327 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205338 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-modprobe-d\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205349 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-device-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205348 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-system-cni-dir\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-hosts-file\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205396 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19e9bbf4-2f93-4060-aa4f-2838412f8254-host\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm"
Apr 16 16:03:02.206583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-log-socket\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-cni-multus\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205415 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-hosts-file\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205429 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06eb6e48-21ae-44ee-bf36-e4206b109746-ovn-node-metrics-cert\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-systemd\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205457 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-cni-netd\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-kubelet\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205487 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/19e9bbf4-2f93-4060-aa4f-2838412f8254-serviceca\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205508 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-log-socket\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205521 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-cni-bin\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205406 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-multus-certs\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205545 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-ovnkube-script-lib\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205586 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phpkw\" (UniqueName: \"kubernetes.io/projected/06eb6e48-21ae-44ee-bf36-e4206b109746-kube-api-access-phpkw\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205640 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysctl-conf\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205670 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-var-lib-kubelet\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205691 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjsww\" (UniqueName: \"kubernetes.io/projected/2f64ce22-0fab-4753-8319-62ac8a354b24-kube-api-access-zjsww\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.207452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205718 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-socket-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205742 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-system-cni-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-tmp-dir\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205797 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8zg\" (UniqueName: \"kubernetes.io/projected/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-kube-api-access-cg8zg\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205824 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-cni-binary-copy\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205851 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f10efae8-7168-40df-b502-62b0d2d36756-iptables-alerter-script\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205883 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-run-netns\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205895 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/19e9bbf4-2f93-4060-aa4f-2838412f8254-serviceca\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205910 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205886 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205943 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-cni-bin\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.205973 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-cnibin\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206017 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysctl-d\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206028 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-socket-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-lib-modules\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-hostroot\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206119 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-conf-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206137 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysctl-conf\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tznpp\" (UniqueName: \"kubernetes.io/projected/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-kube-api-access-tznpp\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206146 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-cni-bin\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206409 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-os-release\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206444 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-node-log\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206471 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-daemon-config\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-registration-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206523 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/48f3d044-df7a-4aec-b463-07afb7514443-konnectivity-ca\") pod \"konnectivity-agent-9wz49\" (UID: \"48f3d044-df7a-4aec-b463-07afb7514443\") " pod="kube-system/konnectivity-agent-9wz49"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206549 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-kubernetes\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206574 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-k8s-cni-cncf-io\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-kubelet\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206649 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-var-lib-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206674 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-env-overrides\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206696 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206739 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-ovnkube-script-lib\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206761 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-os-release\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.208908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-host\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-node-log\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206880 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-cni-binary-copy\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206945 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysctl-d\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207007 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-lib-modules\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207027 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-tmp-dir\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207085 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207106 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-system-cni-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.206700 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-host\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207146 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-hostroot\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207209 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-daemon-config\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207229 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-multus-conf-dir\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207273 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-registration-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207438 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f10efae8-7168-40df-b502-62b0d2d36756-iptables-alerter-script\") pod \"iptables-alerter-lhzgg\" (UID: \"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207508 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-run-netns\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207560 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-systemd-units\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.209679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207611 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb39d18c-d897-41cd-b539-6c31f7f376e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207653 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-var-lib-cni-bin\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207684 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-systemd-units\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207701 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysconfig\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207739 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-host-run-k8s-cni-cncf-io\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-kubernetes\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207799 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb39d18c-d897-41cd-b539-6c31f7f376e3-cnibin\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207655 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-sysconfig\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-cnibin\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207848 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-run\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207876 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f64ce22-0fab-4753-8319-62ac8a354b24-tmp\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207928 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-var-lib-kubelet\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.207986 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-cnibin\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208027 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-host-kubelet\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-run-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06eb6e48-21ae-44ee-bf36-e4206b109746-env-overrides\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/48f3d044-df7a-4aec-b463-07afb7514443-agent-certs\") pod \"konnectivity-agent-9wz49\" (UID: \"48f3d044-df7a-4aec-b463-07afb7514443\") " pod="kube-system/konnectivity-agent-9wz49"
Apr 16 16:03:02.210405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208260 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/06eb6e48-21ae-44ee-bf36-e4206b109746-var-lib-openvswitch\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.211135 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06eb6e48-21ae-44ee-bf36-e4206b109746-ovn-node-metrics-cert\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.211135 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208539 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2f64ce22-0fab-4753-8319-62ac8a354b24-etc-tuned\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.211135 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208590 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/48f3d044-df7a-4aec-b463-07afb7514443-konnectivity-ca\") pod \"konnectivity-agent-9wz49\" (UID: \"48f3d044-df7a-4aec-b463-07afb7514443\") " pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:02.211135 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.208683 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2f64ce22-0fab-4753-8319-62ac8a354b24-run\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.211135 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.210404 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2f64ce22-0fab-4753-8319-62ac8a354b24-tmp\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.214681 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.214662 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:02.214681 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.214682 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:02.214832 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.214692 2566 projected.go:194] Error preparing data for projected volume kube-api-access-zmgxx for pod openshift-network-diagnostics/network-check-target-gvxkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:02.214832 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.214735 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx podName:99da9992-0d67-494e-853e-a94744056361 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:02.714721029 +0000 UTC m=+2.155249462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zmgxx" (UniqueName: "kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx") pod "network-check-target-gvxkh" (UID: "99da9992-0d67-494e-853e-a94744056361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:02.216955 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.216935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phpkw\" (UniqueName: \"kubernetes.io/projected/06eb6e48-21ae-44ee-bf36-e4206b109746-kube-api-access-phpkw\") pod \"ovnkube-node-bkhjf\" (UID: \"06eb6e48-21ae-44ee-bf36-e4206b109746\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.217050 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.217002 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8zg\" (UniqueName: \"kubernetes.io/projected/8ef3fea7-640f-4dba-bdd8-5a484f18ccfa-kube-api-access-cg8zg\") pod \"aws-ebs-csi-driver-node-7fcwm\" (UID: \"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.217294 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.217276 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjsww\" (UniqueName: \"kubernetes.io/projected/2f64ce22-0fab-4753-8319-62ac8a354b24-kube-api-access-zjsww\") pod \"tuned-72bdw\" (UID: \"2f64ce22-0fab-4753-8319-62ac8a354b24\") " pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.217468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.217447 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzf77\" (UniqueName: \"kubernetes.io/projected/f10efae8-7168-40df-b502-62b0d2d36756-kube-api-access-rzf77\") pod \"iptables-alerter-lhzgg\" (UID: 
\"f10efae8-7168-40df-b502-62b0d2d36756\") " pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.217939 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.217924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznpp\" (UniqueName: \"kubernetes.io/projected/06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7-kube-api-access-tznpp\") pod \"multus-vscz2\" (UID: \"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7\") " pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.218132 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.218112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcv4t\" (UniqueName: \"kubernetes.io/projected/eb39d18c-d897-41cd-b539-6c31f7f376e3-kube-api-access-pcv4t\") pod \"multus-additional-cni-plugins-pmzx9\" (UID: \"eb39d18c-d897-41cd-b539-6c31f7f376e3\") " pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.221763 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.221742 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzg77\" (UniqueName: \"kubernetes.io/projected/3e5c15b5-f8c7-478b-a327-14aad8952c3f-kube-api-access-bzg77\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.222475 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.222448 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hrfr\" (UniqueName: \"kubernetes.io/projected/19e9bbf4-2f93-4060-aa4f-2838412f8254-kube-api-access-4hrfr\") pod \"node-ca-kl4jm\" (UID: \"19e9bbf4-2f93-4060-aa4f-2838412f8254\") " pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.222552 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.222519 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5bb\" (UniqueName: 
\"kubernetes.io/projected/b54e0816-7a3e-49eb-bc50-eebcbb3a03c2-kube-api-access-zl5bb\") pod \"node-resolver-mjkzm\" (UID: \"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2\") " pod="openshift-dns/node-resolver-mjkzm" Apr 16 16:03:02.407267 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.407173 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:02.414309 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.414287 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f3d044_df7a_4aec_b463_07afb7514443.slice/crio-83577b216594877444a0747c135c6633b0c4a7a60ca2750d3818e7289cacace5 WatchSource:0}: Error finding container 83577b216594877444a0747c135c6633b0c4a7a60ca2750d3818e7289cacace5: Status 404 returned error can't find the container with id 83577b216594877444a0747c135c6633b0c4a7a60ca2750d3818e7289cacace5 Apr 16 16:03:02.420336 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.420315 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" Apr 16 16:03:02.425933 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.425909 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef3fea7_640f_4dba_bdd8_5a484f18ccfa.slice/crio-addcfe0a307b9fca759dc47739c23a428f6c0a82620b3a3dbb6090b035f1dd0b WatchSource:0}: Error finding container addcfe0a307b9fca759dc47739c23a428f6c0a82620b3a3dbb6090b035f1dd0b: Status 404 returned error can't find the container with id addcfe0a307b9fca759dc47739c23a428f6c0a82620b3a3dbb6090b035f1dd0b Apr 16 16:03:02.427237 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.427220 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mjkzm" Apr 16 16:03:02.434887 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.434852 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb54e0816_7a3e_49eb_bc50_eebcbb3a03c2.slice/crio-9e3d037f014e6f8e0d4f52ecd0364b7d40697630142efe06d1f2d7aba1337571 WatchSource:0}: Error finding container 9e3d037f014e6f8e0d4f52ecd0364b7d40697630142efe06d1f2d7aba1337571: Status 404 returned error can't find the container with id 9e3d037f014e6f8e0d4f52ecd0364b7d40697630142efe06d1f2d7aba1337571 Apr 16 16:03:02.443748 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.443731 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kl4jm" Apr 16 16:03:02.449128 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.449109 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e9bbf4_2f93_4060_aa4f_2838412f8254.slice/crio-0e92cf6a1b1cf1f8c3e109f17d95ed77453d967e960b6ecf92d9f4c53083ea79 WatchSource:0}: Error finding container 0e92cf6a1b1cf1f8c3e109f17d95ed77453d967e960b6ecf92d9f4c53083ea79: Status 404 returned error can't find the container with id 0e92cf6a1b1cf1f8c3e109f17d95ed77453d967e960b6ecf92d9f4c53083ea79 Apr 16 16:03:02.462916 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.462897 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vscz2" Apr 16 16:03:02.468292 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.468272 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e6e0e3_2b76_41df_bcbc_8f60a89cd7e7.slice/crio-239a6a4bcdc6ace8924dc4d9ed3a896e06da6e03d3c5a81fab6c03d0d8877f2e WatchSource:0}: Error finding container 239a6a4bcdc6ace8924dc4d9ed3a896e06da6e03d3c5a81fab6c03d0d8877f2e: Status 404 returned error can't find the container with id 239a6a4bcdc6ace8924dc4d9ed3a896e06da6e03d3c5a81fab6c03d0d8877f2e Apr 16 16:03:02.479607 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.479590 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lhzgg" Apr 16 16:03:02.485544 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.485523 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf10efae8_7168_40df_b502_62b0d2d36756.slice/crio-96eec3020ef0bb31e39d05075a4958ba4db9d45684ecbd9aa4612a8b9568aeb5 WatchSource:0}: Error finding container 96eec3020ef0bb31e39d05075a4958ba4db9d45684ecbd9aa4612a8b9568aeb5: Status 404 returned error can't find the container with id 96eec3020ef0bb31e39d05075a4958ba4db9d45684ecbd9aa4612a8b9568aeb5 Apr 16 16:03:02.487129 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.487111 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:02.492552 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.492535 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06eb6e48_21ae_44ee_bf36_e4206b109746.slice/crio-a65aa8e5f20c3a75820c02bd05a33bed42401dd9b0df982fadd3928f499e0f64 WatchSource:0}: Error finding container a65aa8e5f20c3a75820c02bd05a33bed42401dd9b0df982fadd3928f499e0f64: Status 404 returned error can't find the container with id a65aa8e5f20c3a75820c02bd05a33bed42401dd9b0df982fadd3928f499e0f64 Apr 16 16:03:02.493275 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.493260 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-72bdw" Apr 16 16:03:02.498605 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.498588 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" Apr 16 16:03:02.499445 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.499327 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f64ce22_0fab_4753_8319_62ac8a354b24.slice/crio-b06262efb2d5667c5e46551d2ad2f443fdb0eec4cb352ba124f2608316b8be4b WatchSource:0}: Error finding container b06262efb2d5667c5e46551d2ad2f443fdb0eec4cb352ba124f2608316b8be4b: Status 404 returned error can't find the container with id b06262efb2d5667c5e46551d2ad2f443fdb0eec4cb352ba124f2608316b8be4b Apr 16 16:03:02.505350 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:02.505331 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb39d18c_d897_41cd_b539_6c31f7f376e3.slice/crio-b24749dcc9f4891322aec94cc99101dde4401e8619145266863d169c41135cf6 WatchSource:0}: Error finding container 
b24749dcc9f4891322aec94cc99101dde4401e8619145266863d169c41135cf6: Status 404 returned error can't find the container with id b24749dcc9f4891322aec94cc99101dde4401e8619145266863d169c41135cf6 Apr 16 16:03:02.713162 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.713001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:02.713162 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.713160 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.713449 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.713223 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:03.713205907 +0000 UTC m=+3.153734357 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.814198 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:02.814158 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:02.814373 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.814358 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:02.814489 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.814476 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:02.814549 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.814497 2566 projected.go:194] Error preparing data for projected volume kube-api-access-zmgxx for pod openshift-network-diagnostics/network-check-target-gvxkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:02.814605 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:02.814565 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx podName:99da9992-0d67-494e-853e-a94744056361 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:03.81454421 +0000 UTC m=+3.255072658 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zmgxx" (UniqueName: "kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx") pod "network-check-target-gvxkh" (UID: "99da9992-0d67-494e-853e-a94744056361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:03.140374 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.140267 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:58:02 +0000 UTC" deadline="2027-12-06 01:53:01.431834995 +0000 UTC" Apr 16 16:03:03.140374 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.140309 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14361h49m58.291530366s" Apr 16 16:03:03.201875 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.201844 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:03.202060 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:03.201970 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:03.208904 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.204842 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:03.219842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.219805 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerStarted","Data":"b24749dcc9f4891322aec94cc99101dde4401e8619145266863d169c41135cf6"} Apr 16 16:03:03.233674 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.233572 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"a65aa8e5f20c3a75820c02bd05a33bed42401dd9b0df982fadd3928f499e0f64"} Apr 16 16:03:03.251934 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.251865 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vscz2" event={"ID":"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7","Type":"ContainerStarted","Data":"239a6a4bcdc6ace8924dc4d9ed3a896e06da6e03d3c5a81fab6c03d0d8877f2e"} Apr 16 16:03:03.265379 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.265326 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kl4jm" event={"ID":"19e9bbf4-2f93-4060-aa4f-2838412f8254","Type":"ContainerStarted","Data":"0e92cf6a1b1cf1f8c3e109f17d95ed77453d967e960b6ecf92d9f4c53083ea79"} Apr 16 16:03:03.285584 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.285456 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" 
event={"ID":"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa","Type":"ContainerStarted","Data":"addcfe0a307b9fca759dc47739c23a428f6c0a82620b3a3dbb6090b035f1dd0b"} Apr 16 16:03:03.291185 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.291152 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-72bdw" event={"ID":"2f64ce22-0fab-4753-8319-62ac8a354b24","Type":"ContainerStarted","Data":"b06262efb2d5667c5e46551d2ad2f443fdb0eec4cb352ba124f2608316b8be4b"} Apr 16 16:03:03.297020 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.296987 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lhzgg" event={"ID":"f10efae8-7168-40df-b502-62b0d2d36756","Type":"ContainerStarted","Data":"96eec3020ef0bb31e39d05075a4958ba4db9d45684ecbd9aa4612a8b9568aeb5"} Apr 16 16:03:03.329187 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.329047 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mjkzm" event={"ID":"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2","Type":"ContainerStarted","Data":"9e3d037f014e6f8e0d4f52ecd0364b7d40697630142efe06d1f2d7aba1337571"} Apr 16 16:03:03.337409 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.337358 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9wz49" event={"ID":"48f3d044-df7a-4aec-b463-07afb7514443","Type":"ContainerStarted","Data":"83577b216594877444a0747c135c6633b0c4a7a60ca2750d3818e7289cacace5"} Apr 16 16:03:03.431850 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.431767 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:03.581480 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.581445 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:03:03.724283 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.724195 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:03.724455 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:03.724378 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:03.724455 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:03.724444 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:05.724423929 +0000 UTC m=+5.164952375 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:03.826727 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:03.826689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:03.826903 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:03.826888 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:03.826969 ip-10-0-136-220 kubenswrapper[2566]: E0416 
16:03:03.826908 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:03.826969 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:03.826921 2566 projected.go:194] Error preparing data for projected volume kube-api-access-zmgxx for pod openshift-network-diagnostics/network-check-target-gvxkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:03.827083 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:03.826980 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx podName:99da9992-0d67-494e-853e-a94744056361 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:05.826962265 +0000 UTC m=+5.267490713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zmgxx" (UniqueName: "kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx") pod "network-check-target-gvxkh" (UID: "99da9992-0d67-494e-853e-a94744056361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:04.140882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:04.140791 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:58:02 +0000 UTC" deadline="2027-11-15 03:55:56.560295932 +0000 UTC" Apr 16 16:03:04.140882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:04.140834 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13859h52m52.419466136s" Apr 16 16:03:04.199524 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:04.199466 2566 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:04.199715 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:04.199649 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:05.202049 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:05.202015 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:05.202610 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.202148 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:05.744311 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:05.744269 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:05.744492 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.744475 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:05.744591 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.744543 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:09.744522305 +0000 UTC m=+9.185050753 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:05.845452 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:05.845415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:05.845654 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.845583 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:05.845654 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.845601 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:05.845654 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.845629 2566 projected.go:194] Error preparing data for projected volume kube-api-access-zmgxx for pod openshift-network-diagnostics/network-check-target-gvxkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:05.845875 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:05.845686 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx podName:99da9992-0d67-494e-853e-a94744056361 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:09.845668329 +0000 UTC m=+9.286196767 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zmgxx" (UniqueName: "kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx") pod "network-check-target-gvxkh" (UID: "99da9992-0d67-494e-853e-a94744056361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:06.199707 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:06.199673 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:06.199920 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:06.199835 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:07.199864 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:07.199709 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:07.200320 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:07.199851 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:08.199702 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:08.199661 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:08.199894 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:08.199811 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:09.199687 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:09.199654 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:09.199868 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.199779 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:09.776042 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:09.775959 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:09.776458 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.776122 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:09.776458 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.776200 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:17.77617748 +0000 UTC m=+17.216705929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:09.876388 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:09.876349 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:09.876562 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.876540 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:09.876562 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.876562 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:09.876770 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.876575 2566 projected.go:194] Error preparing data for projected volume kube-api-access-zmgxx for pod openshift-network-diagnostics/network-check-target-gvxkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:09.876770 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:09.876674 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx podName:99da9992-0d67-494e-853e-a94744056361 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:17.87663542 +0000 UTC m=+17.317163855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zmgxx" (UniqueName: "kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx") pod "network-check-target-gvxkh" (UID: "99da9992-0d67-494e-853e-a94744056361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:10.199823 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:10.199784 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:10.200009 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:10.199930 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:11.200435 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:11.200396 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:11.200919 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:11.200517 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:12.199700 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:12.199663 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:12.199997 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:12.199814 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:13.199926 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:13.199886 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:13.200365 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:13.200017 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:14.199801 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:14.199770 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:14.199953 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:14.199892 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:15.199472 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:15.199434 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:15.199675 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:15.199545 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:16.199345 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:16.199305 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:16.199815 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:16.199460 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:17.199731 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:17.199508 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:17.200180 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.199838 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:17.831368 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:17.831337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:17.831597 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.831435 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:17.831597 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.831510 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:33.831488508 +0000 UTC m=+33.272016958 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:17.932493 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:17.932465 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:17.932699 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.932661 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:17.932699 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.932680 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:17.932699 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.932689 2566 projected.go:194] Error preparing data for projected volume kube-api-access-zmgxx for pod openshift-network-diagnostics/network-check-target-gvxkh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:17.932852 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:17.932746 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx podName:99da9992-0d67-494e-853e-a94744056361 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:33.932730462 +0000 UTC m=+33.373258900 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zmgxx" (UniqueName: "kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx") pod "network-check-target-gvxkh" (UID: "99da9992-0d67-494e-853e-a94744056361") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:18.199257 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:18.199220 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:18.199444 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:18.199354 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:19.199911 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:19.199874 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:19.200289 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:19.200009 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:20.199377 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:20.199339 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:20.199558 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:20.199476 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:21.199732 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.199500 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:21.200323 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:21.199748 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:21.399700 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.399516 2566 generic.go:358] "Generic (PLEG): container finished" podID="a79d77c3a5cce0b825c16bee49a58996" containerID="2ac3047d7f628ac6870333fc7de6d7f41f0f6ed44c61ee7ea591d9ac7c6db1c4" exitCode=0 Apr 16 16:03:21.399842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.399607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal" event={"ID":"a79d77c3a5cce0b825c16bee49a58996","Type":"ContainerDied","Data":"2ac3047d7f628ac6870333fc7de6d7f41f0f6ed44c61ee7ea591d9ac7c6db1c4"} Apr 16 16:03:21.400998 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.400971 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb39d18c-d897-41cd-b539-6c31f7f376e3" containerID="875181c9c7ac7bf6026eb028bd88a80c3f69010a12dc8fa8b3a6f3ff1c916e23" exitCode=0 Apr 16 16:03:21.401136 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.401061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerDied","Data":"875181c9c7ac7bf6026eb028bd88a80c3f69010a12dc8fa8b3a6f3ff1c916e23"} Apr 16 16:03:21.403656 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.403636 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:03:21.403980 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.403953 2566 generic.go:358] "Generic (PLEG): container finished" podID="06eb6e48-21ae-44ee-bf36-e4206b109746" containerID="0a1508724185325c7b00be166e8365439f8ce93bf263d370b72942659c8c11c3" exitCode=1 Apr 16 16:03:21.404035 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.404020 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"39b00fdfbe97166b9dc987547718254d7af86fb2f582635796ed618bebe0ccc3"} Apr 16 16:03:21.404082 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.404048 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"be92ab7c2f7bd59424b38b184686b3a22cdc5f41a45920a00d9f8c0838fe6594"} Apr 16 16:03:21.404082 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.404061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"0872d7e86319d710e8ad02e61ae38153ea5b99b753fa04e6441a260a1fa44523"} Apr 16 16:03:21.404082 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.404074 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"efadf88d238d60d0753f3de351e37167b0fbbcec04538e186fa11295d36abe83"} Apr 16 16:03:21.404178 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.404085 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerDied","Data":"0a1508724185325c7b00be166e8365439f8ce93bf263d370b72942659c8c11c3"} Apr 16 16:03:21.404178 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.404100 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"238763f134cfa2aa473043eea94b5b3e3032bac528a9756e48e97f71c1d1b624"} Apr 16 16:03:21.405266 ip-10-0-136-220 
kubenswrapper[2566]: I0416 16:03:21.405247 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vscz2" event={"ID":"06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7","Type":"ContainerStarted","Data":"a4797af19d6a9f0bb3d1e26681a8c49eba9384069370a3dc8e071d4b7dd21056"} Apr 16 16:03:21.406485 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.406462 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kl4jm" event={"ID":"19e9bbf4-2f93-4060-aa4f-2838412f8254","Type":"ContainerStarted","Data":"58e68049a609b94d5af588eb1d8b5310300ca3a72d42b3d96c408b17c2ca2de8"} Apr 16 16:03:21.407661 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.407639 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" event={"ID":"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa","Type":"ContainerStarted","Data":"3ff273cfcb30f979c306dfd9a8c0be2249754d774344cbaac5b9f6e555a0d10d"} Apr 16 16:03:21.408714 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.408692 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal" event={"ID":"ba662560bc4f4ac2db429e0e7b17ae15","Type":"ContainerStarted","Data":"e1fe85cd78a391cd81f308c6de0ba424037c2a106811f3b87606ccd9fcecd2f7"} Apr 16 16:03:21.409859 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.409840 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-72bdw" event={"ID":"2f64ce22-0fab-4753-8319-62ac8a354b24","Type":"ContainerStarted","Data":"1a9840b0fe46599a9ec494efeb564a756d3a42c78d03723fff0a686714f7607b"} Apr 16 16:03:21.413313 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.413288 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mjkzm" 
event={"ID":"b54e0816-7a3e-49eb-bc50-eebcbb3a03c2","Type":"ContainerStarted","Data":"d9424ab89f1e7683963b849939a2126a666978cae2ce823d1807d0128368322c"} Apr 16 16:03:21.414632 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.414581 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9wz49" event={"ID":"48f3d044-df7a-4aec-b463-07afb7514443","Type":"ContainerStarted","Data":"8a11542a7db4c6a799469270c7d4206f54f56f1cc17fcd6f93d7ceee848762e3"} Apr 16 16:03:21.432206 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.432152 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-220.ec2.internal" podStartSLOduration=20.432133981 podStartE2EDuration="20.432133981s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:21.431577441 +0000 UTC m=+20.872105886" watchObservedRunningTime="2026-04-16 16:03:21.432133981 +0000 UTC m=+20.872662436" Apr 16 16:03:21.479139 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.478568 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vscz2" podStartSLOduration=2.653270094 podStartE2EDuration="20.478554167s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.469755013 +0000 UTC m=+1.910283446" lastFinishedPulling="2026-04-16 16:03:20.295039086 +0000 UTC m=+19.735567519" observedRunningTime="2026-04-16 16:03:21.478242914 +0000 UTC m=+20.918771369" watchObservedRunningTime="2026-04-16 16:03:21.478554167 +0000 UTC m=+20.919082622" Apr 16 16:03:21.479443 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.479409 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mjkzm" podStartSLOduration=2.628764322 podStartE2EDuration="20.479397999s" 
podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.436722254 +0000 UTC m=+1.877250688" lastFinishedPulling="2026-04-16 16:03:20.28735592 +0000 UTC m=+19.727884365" observedRunningTime="2026-04-16 16:03:21.454554602 +0000 UTC m=+20.895083069" watchObservedRunningTime="2026-04-16 16:03:21.479397999 +0000 UTC m=+20.919926453" Apr 16 16:03:21.531315 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.531274 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-72bdw" podStartSLOduration=2.744617907 podStartE2EDuration="20.53125959s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.5024855 +0000 UTC m=+1.943013933" lastFinishedPulling="2026-04-16 16:03:20.28912718 +0000 UTC m=+19.729655616" observedRunningTime="2026-04-16 16:03:21.50066937 +0000 UTC m=+20.941197828" watchObservedRunningTime="2026-04-16 16:03:21.53125959 +0000 UTC m=+20.971788045" Apr 16 16:03:21.547161 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.547112 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9wz49" podStartSLOduration=2.726755152 podStartE2EDuration="20.547094015s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.415900064 +0000 UTC m=+1.856428498" lastFinishedPulling="2026-04-16 16:03:20.236238915 +0000 UTC m=+19.676767361" observedRunningTime="2026-04-16 16:03:21.54669639 +0000 UTC m=+20.987224865" watchObservedRunningTime="2026-04-16 16:03:21.547094015 +0000 UTC m=+20.987622471" Apr 16 16:03:21.579108 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.578786 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kl4jm" podStartSLOduration=2.742119756 podStartE2EDuration="20.578766716s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 
16:03:02.45052697 +0000 UTC m=+1.891055403" lastFinishedPulling="2026-04-16 16:03:20.28717393 +0000 UTC m=+19.727702363" observedRunningTime="2026-04-16 16:03:21.578392992 +0000 UTC m=+21.018921447" watchObservedRunningTime="2026-04-16 16:03:21.578766716 +0000 UTC m=+21.019295174" Apr 16 16:03:21.929363 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:21.929319 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:03:22.168485 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.168384 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:03:21.929339179Z","UUID":"4b5f9cb5-9c91-4d81-bc97-51023a6948f0","Handler":null,"Name":"","Endpoint":""} Apr 16 16:03:22.170926 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.170899 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:03:22.171091 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.170936 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:03:22.199601 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.199566 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:22.199766 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:22.199710 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:22.419845 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.419085 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lhzgg" event={"ID":"f10efae8-7168-40df-b502-62b0d2d36756","Type":"ContainerStarted","Data":"fa80f47b5ec546c4b6040d4a2bb0966cc62190e4a8adbb18e220e14ecacd414d"} Apr 16 16:03:22.422832 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.422801 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal" event={"ID":"a79d77c3a5cce0b825c16bee49a58996","Type":"ContainerStarted","Data":"d297f7bd265119f123718ade53fc5914a60bebedb5cf70afcb566ea6757a2795"} Apr 16 16:03:22.425227 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.425102 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" event={"ID":"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa","Type":"ContainerStarted","Data":"aec539d960ecb231e723dd3258294600bb56c8ccb874253dbfa0c221faaaf57b"} Apr 16 16:03:22.439115 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.438791 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lhzgg" podStartSLOduration=3.638861259 podStartE2EDuration="21.438774572s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.486974244 +0000 UTC m=+1.927502680" lastFinishedPulling="2026-04-16 16:03:20.286887547 +0000 UTC m=+19.727415993" observedRunningTime="2026-04-16 16:03:22.438244758 +0000 UTC m=+21.878773214" watchObservedRunningTime="2026-04-16 16:03:22.438774572 +0000 UTC m=+21.879303026" Apr 16 16:03:22.457583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:22.457527 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-220.ec2.internal" podStartSLOduration=21.457509739 podStartE2EDuration="21.457509739s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:22.457277056 +0000 UTC m=+21.897805522" watchObservedRunningTime="2026-04-16 16:03:22.457509739 +0000 UTC m=+21.898038199" Apr 16 16:03:23.199705 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:23.199664 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:23.199902 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:23.199789 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:23.429782 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:23.429748 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:03:23.430164 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:23.430125 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"30a714a0e4e44e45a723c0f876cf0f28098ed0f4adf74f117272d4396c8c72d3"} Apr 16 16:03:23.431999 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:23.431974 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" event={"ID":"8ef3fea7-640f-4dba-bdd8-5a484f18ccfa","Type":"ContainerStarted","Data":"bffef702a3a17af867eceb3ad7e66855201ea9c21bd8ba8fb4701950475e23ed"} Apr 16 16:03:23.472805 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:23.472700 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcwm" podStartSLOduration=2.153438771 podStartE2EDuration="22.472684993s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.427387957 +0000 UTC m=+1.867916400" lastFinishedPulling="2026-04-16 16:03:22.746634185 +0000 UTC m=+22.187162622" observedRunningTime="2026-04-16 16:03:23.472550536 +0000 UTC m=+22.913078992" watchObservedRunningTime="2026-04-16 16:03:23.472684993 +0000 UTC m=+22.913213447" Apr 16 16:03:24.199099 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:24.199067 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:24.199271 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:24.199192 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:24.739538 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:24.739503 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:24.740756 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:24.740735 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:25.199151 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:25.199119 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:25.199323 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:25.199250 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:25.438796 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:25.438577 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:03:26.199932 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.199900 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:26.200581 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:26.200002 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:26.442981 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.442941 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb39d18c-d897-41cd-b539-6c31f7f376e3" containerID="3f887b82e25e2f344ff081bd09099250ea743bbe0312257fba6de8865f7c58aa" exitCode=0 Apr 16 16:03:26.443143 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.443019 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerDied","Data":"3f887b82e25e2f344ff081bd09099250ea743bbe0312257fba6de8865f7c58aa"} Apr 16 16:03:26.446055 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.446037 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:03:26.446361 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.446339 
2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"3cb3a1e848557fdc59fa458da784ae5e2193fbff01560a68c52db995447e02d5"} Apr 16 16:03:26.446660 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.446642 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:26.446721 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.446670 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:26.446771 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.446756 2566 scope.go:117] "RemoveContainer" containerID="0a1508724185325c7b00be166e8365439f8ce93bf263d370b72942659c8c11c3" Apr 16 16:03:26.461864 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.461840 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:26.621736 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.621697 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:26.621898 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.621816 2566 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:03:26.622503 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:26.622482 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9wz49" Apr 16 16:03:27.200416 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.199554 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:27.200416 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:27.199892 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:27.434720 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.434686 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gvxkh"] Apr 16 16:03:27.436871 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.436846 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4kkrg"] Apr 16 16:03:27.436981 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.436972 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:27.437105 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:27.437081 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:27.450154 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.450126 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb39d18c-d897-41cd-b539-6c31f7f376e3" containerID="cc5505b57a78eec159ac7bc2cf4dc39abde0aec8e2c69c2dc923e3ef30e779a4" exitCode=0 Apr 16 16:03:27.450296 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.450193 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerDied","Data":"cc5505b57a78eec159ac7bc2cf4dc39abde0aec8e2c69c2dc923e3ef30e779a4"} Apr 16 16:03:27.454198 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.454180 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:03:27.454627 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.454600 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:27.454679 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.454611 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" event={"ID":"06eb6e48-21ae-44ee-bf36-e4206b109746","Type":"ContainerStarted","Data":"c9332499f6ff333e613f234bc067ab7c129bef1bb82c47a2859f94589eabb288"} Apr 16 16:03:27.454739 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:27.454719 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:27.455057 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.454898 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:27.470918 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.470898 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:03:27.508050 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:27.508006 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" podStartSLOduration=8.630863284 podStartE2EDuration="26.507993782s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.494108998 +0000 UTC m=+1.934637430" lastFinishedPulling="2026-04-16 16:03:20.371239488 +0000 UTC m=+19.811767928" observedRunningTime="2026-04-16 16:03:27.507914116 +0000 UTC m=+26.948442571" watchObservedRunningTime="2026-04-16 16:03:27.507993782 +0000 UTC m=+26.948522237" Apr 16 16:03:28.458251 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:28.458221 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb39d18c-d897-41cd-b539-6c31f7f376e3" containerID="3dbdd86bf3b455f7b2db01737226c5be1a27ed9582b806d6a13245a98e6a9f0b" exitCode=0 Apr 16 16:03:28.458631 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:28.458299 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerDied","Data":"3dbdd86bf3b455f7b2db01737226c5be1a27ed9582b806d6a13245a98e6a9f0b"} Apr 16 16:03:29.199633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:29.199590 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:29.199633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:29.199605 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:29.199831 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:29.199754 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:29.199919 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:29.199875 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:31.200091 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:31.200022 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:31.200532 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:31.200133 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f" Apr 16 16:03:31.200532 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:31.200157 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:31.200532 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:31.200274 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gvxkh" podUID="99da9992-0d67-494e-853e-a94744056361" Apr 16 16:03:32.367298 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.367064 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-220.ec2.internal" event="NodeReady" Apr 16 16:03:32.367752 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.367371 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:03:32.416926 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.416891 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fm66n"] Apr 16 16:03:32.446910 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.446872 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pxsb9"] Apr 16 16:03:32.447063 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.446918 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.449627 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.449592 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:03:32.449769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.449748 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v9cqx\"" Apr 16 16:03:32.449938 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.449919 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:03:32.469226 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.469203 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fm66n"] Apr 16 16:03:32.469226 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.469232 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pxsb9"] Apr 16 16:03:32.469404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.469322 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:32.472355 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.472327 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:03:32.472491 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.472398 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wjzs4\"" Apr 16 16:03:32.472491 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.472445 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:03:32.472830 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.472813 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:03:32.538792 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.538756 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/045826c8-ee95-494f-8d06-d8d18b2717ca-tmp-dir\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.538792 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.538794 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v2zd\" (UniqueName: \"kubernetes.io/projected/045826c8-ee95-494f-8d06-d8d18b2717ca-kube-api-access-8v2zd\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.539019 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.538821 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.539019 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.538908 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9f7v\" (UniqueName: \"kubernetes.io/projected/deb31f3a-b5fe-4b80-a8dd-2ef625183254-kube-api-access-j9f7v\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:32.539019 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.538944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:32.539019 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.539001 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/045826c8-ee95-494f-8d06-d8d18b2717ca-config-volume\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.640219 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640177 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9f7v\" (UniqueName: \"kubernetes.io/projected/deb31f3a-b5fe-4b80-a8dd-2ef625183254-kube-api-access-j9f7v\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:32.640399 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640323 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:32.640399 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640380 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/045826c8-ee95-494f-8d06-d8d18b2717ca-config-volume\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.640481 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640413 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/045826c8-ee95-494f-8d06-d8d18b2717ca-tmp-dir\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.640481 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640429 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2zd\" (UniqueName: \"kubernetes.io/projected/045826c8-ee95-494f-8d06-d8d18b2717ca-kube-api-access-8v2zd\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.640481 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640446 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.640580 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:32.640547 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 16:03:32.640638 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:32.640599 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:03:33.140582408 +0000 UTC m=+32.581110841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:03:32.641019 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.640994 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/045826c8-ee95-494f-8d06-d8d18b2717ca-tmp-dir\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.641118 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.641038 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/045826c8-ee95-494f-8d06-d8d18b2717ca-config-volume\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.641169 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:32.641127 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:32.641240 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:32.641184 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:33.141167782 +0000 UTC m=+32.581696215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:03:32.653176 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.653144 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2zd\" (UniqueName: \"kubernetes.io/projected/045826c8-ee95-494f-8d06-d8d18b2717ca-kube-api-access-8v2zd\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:32.653325 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:32.653196 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9f7v\" (UniqueName: \"kubernetes.io/projected/deb31f3a-b5fe-4b80-a8dd-2ef625183254-kube-api-access-j9f7v\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:33.143976 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.143930 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:33.144140 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.144000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:33.144140 ip-10-0-136-220 kubenswrapper[2566]: E0416 
16:03:33.144105 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:33.144140 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:33.144109 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:33.144273 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:33.144172 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:34.144153903 +0000 UTC m=+33.584682347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:03:33.144273 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:33.144189 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:03:34.14417984 +0000 UTC m=+33.584708272 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:03:33.199683 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.199643 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:33.199683 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.199677 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:33.203646 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.203604 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:03:33.203646 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.203647 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x7n6v\"" Apr 16 16:03:33.203842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.203609 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:03:33.203842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.203649 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vp75c\"" Apr 16 16:03:33.203842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.203760 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:03:33.848413 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.848380 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:03:33.848974 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:33.848509 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:03:33.848974 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:33.848598 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:04:05.848575474 +0000 UTC m=+65.289103917 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : secret "metrics-daemon-secret" not found Apr 16 16:03:33.949400 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.949355 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:33.952302 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:33.952274 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmgxx\" (UniqueName: \"kubernetes.io/projected/99da9992-0d67-494e-853e-a94744056361-kube-api-access-zmgxx\") pod \"network-check-target-gvxkh\" (UID: \"99da9992-0d67-494e-853e-a94744056361\") " pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:34.118604 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:34.118532 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:34.150641 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:34.150598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:34.150808 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:34.150657 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:34.150808 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:34.150729 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:34.150808 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:34.150747 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:34.150808 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:34.150793 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:03:36.150776637 +0000 UTC m=+35.591305087 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:03:34.150808 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:34.150808 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:36.150802231 +0000 UTC m=+35.591330663 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:03:34.273780 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:34.273752 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gvxkh"] Apr 16 16:03:34.386095 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:34.386057 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99da9992_0d67_494e_853e_a94744056361.slice/crio-496ac42c4dba8e1f47822734863895c893332643b92a8f290e2c568d56bee0f8 WatchSource:0}: Error finding container 496ac42c4dba8e1f47822734863895c893332643b92a8f290e2c568d56bee0f8: Status 404 returned error can't find the container with id 496ac42c4dba8e1f47822734863895c893332643b92a8f290e2c568d56bee0f8 Apr 16 16:03:34.472047 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:34.472013 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gvxkh" event={"ID":"99da9992-0d67-494e-853e-a94744056361","Type":"ContainerStarted","Data":"496ac42c4dba8e1f47822734863895c893332643b92a8f290e2c568d56bee0f8"} Apr 
16 16:03:35.480185 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:35.479970 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb39d18c-d897-41cd-b539-6c31f7f376e3" containerID="73c4d0298b49b096afa77169f2b8a34a9cb34fa6a5297592bf1032d819db008f" exitCode=0 Apr 16 16:03:35.480185 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:35.480053 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerDied","Data":"73c4d0298b49b096afa77169f2b8a34a9cb34fa6a5297592bf1032d819db008f"} Apr 16 16:03:36.164346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:36.164315 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:36.164530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:36.164356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:36.164530 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:36.164450 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:36.164530 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:36.164458 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:36.164530 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:36.164500 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:40.164483357 +0000 UTC m=+39.605011794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:03:36.164530 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:36.164515 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:03:40.16450713 +0000 UTC m=+39.605035563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:03:36.486129 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:36.486044 2566 generic.go:358] "Generic (PLEG): container finished" podID="eb39d18c-d897-41cd-b539-6c31f7f376e3" containerID="04879c0f15f6648b6c55c902053d87404f947ec7b04de9d66d85b73a093afb5c" exitCode=0 Apr 16 16:03:36.486129 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:36.486102 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerDied","Data":"04879c0f15f6648b6c55c902053d87404f947ec7b04de9d66d85b73a093afb5c"} Apr 16 16:03:37.491583 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:37.491553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" 
event={"ID":"eb39d18c-d897-41cd-b539-6c31f7f376e3","Type":"ContainerStarted","Data":"cdb5913c98a610928999878e2bbe1a6f3516dbe4ce192cc38774e0a4c796d598"} Apr 16 16:03:37.519333 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:37.519236 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pmzx9" podStartSLOduration=4.603054649 podStartE2EDuration="36.519220728s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:02.506912173 +0000 UTC m=+1.947440608" lastFinishedPulling="2026-04-16 16:03:34.423078249 +0000 UTC m=+33.863606687" observedRunningTime="2026-04-16 16:03:37.517581327 +0000 UTC m=+36.958109782" watchObservedRunningTime="2026-04-16 16:03:37.519220728 +0000 UTC m=+36.959749182" Apr 16 16:03:38.494559 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:38.494514 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gvxkh" event={"ID":"99da9992-0d67-494e-853e-a94744056361","Type":"ContainerStarted","Data":"e32ab2eb518be17410e05f5e2125569afcc1284e7b54c8fcd0430d8201f58c84"} Apr 16 16:03:38.495041 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:38.494895 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gvxkh" Apr 16 16:03:38.514710 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:38.512544 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gvxkh" podStartSLOduration=34.514547927 podStartE2EDuration="37.512525114s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:34.399321773 +0000 UTC m=+33.839850206" lastFinishedPulling="2026-04-16 16:03:37.397298945 +0000 UTC m=+36.837827393" observedRunningTime="2026-04-16 16:03:38.510784059 +0000 UTC m=+37.951312515" watchObservedRunningTime="2026-04-16 
16:03:38.512525114 +0000 UTC m=+37.953053581" Apr 16 16:03:40.192412 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:40.192374 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:40.192828 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:40.192440 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:40.192828 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:40.192522 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:40.192828 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:40.192524 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:40.192828 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:40.192582 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:03:48.192568101 +0000 UTC m=+47.633096539 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:03:40.192828 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:40.192595 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:48.192589318 +0000 UTC m=+47.633117751 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:03:48.243179 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:48.243141 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:03:48.243179 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:48.243188 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:03:48.243677 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:48.243304 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:48.243677 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:48.243370 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:04.243356008 +0000 UTC m=+63.683884447 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:03:48.243677 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:48.243304 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:48.243677 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:03:48.243448 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:04:04.243435846 +0000 UTC m=+63.683964286 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:03:53.950877 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.950842 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r"] Apr 16 16:03:53.955612 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.955594 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:53.960270 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.960247 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 16:03:53.960393 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.960249 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4v7gg\"" Apr 16 16:03:53.960393 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.960250 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:03:53.961063 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.961044 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:03:53.961138 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.961061 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:03:53.965220 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:53.965199 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r"] Apr 16 16:03:54.084187 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.084146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eff58eab-eb6b-458d-bdfd-c30bc967ef8c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c6d54cf79-ff67r\" (UID: \"eff58eab-eb6b-458d-bdfd-c30bc967ef8c\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.084352 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.084203 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9v5x\" (UniqueName: \"kubernetes.io/projected/eff58eab-eb6b-458d-bdfd-c30bc967ef8c-kube-api-access-g9v5x\") pod \"managed-serviceaccount-addon-agent-7c6d54cf79-ff67r\" (UID: \"eff58eab-eb6b-458d-bdfd-c30bc967ef8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.184574 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.184527 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eff58eab-eb6b-458d-bdfd-c30bc967ef8c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c6d54cf79-ff67r\" (UID: \"eff58eab-eb6b-458d-bdfd-c30bc967ef8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.184691 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.184604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9v5x\" (UniqueName: \"kubernetes.io/projected/eff58eab-eb6b-458d-bdfd-c30bc967ef8c-kube-api-access-g9v5x\") pod \"managed-serviceaccount-addon-agent-7c6d54cf79-ff67r\" (UID: \"eff58eab-eb6b-458d-bdfd-c30bc967ef8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.189236 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.189210 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eff58eab-eb6b-458d-bdfd-c30bc967ef8c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c6d54cf79-ff67r\" (UID: \"eff58eab-eb6b-458d-bdfd-c30bc967ef8c\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.194083 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.194058 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9v5x\" (UniqueName: \"kubernetes.io/projected/eff58eab-eb6b-458d-bdfd-c30bc967ef8c-kube-api-access-g9v5x\") pod \"managed-serviceaccount-addon-agent-7c6d54cf79-ff67r\" (UID: \"eff58eab-eb6b-458d-bdfd-c30bc967ef8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.276211 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.276103 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" Apr 16 16:03:54.388482 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.388452 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r"] Apr 16 16:03:54.391426 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:03:54.391396 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff58eab_eb6b_458d_bdfd_c30bc967ef8c.slice/crio-1f5d5c07400c275ef67ba1405289d506f41410897cc87f713a744cc9235a55e0 WatchSource:0}: Error finding container 1f5d5c07400c275ef67ba1405289d506f41410897cc87f713a744cc9235a55e0: Status 404 returned error can't find the container with id 1f5d5c07400c275ef67ba1405289d506f41410897cc87f713a744cc9235a55e0 Apr 16 16:03:54.525260 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:54.525226 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" 
event={"ID":"eff58eab-eb6b-458d-bdfd-c30bc967ef8c","Type":"ContainerStarted","Data":"1f5d5c07400c275ef67ba1405289d506f41410897cc87f713a744cc9235a55e0"} Apr 16 16:03:57.532648 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:57.532597 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" event={"ID":"eff58eab-eb6b-458d-bdfd-c30bc967ef8c","Type":"ContainerStarted","Data":"b91ce3951b2abe3f1bd445763b3150bfe92d95e0cc60d3208c29bd3ba7cb22db"} Apr 16 16:03:57.549732 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:57.549688 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" podStartSLOduration=2.435930779 podStartE2EDuration="4.549675278s" podCreationTimestamp="2026-04-16 16:03:53 +0000 UTC" firstStartedPulling="2026-04-16 16:03:54.393247343 +0000 UTC m=+53.833775776" lastFinishedPulling="2026-04-16 16:03:56.506991842 +0000 UTC m=+55.947520275" observedRunningTime="2026-04-16 16:03:57.548684186 +0000 UTC m=+56.989212654" watchObservedRunningTime="2026-04-16 16:03:57.549675278 +0000 UTC m=+56.990203732" Apr 16 16:03:59.470508 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:03:59.470483 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkhjf" Apr 16 16:04:04.254022 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:04:04.253975 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n" Apr 16 16:04:04.254022 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:04:04.254028 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9" Apr 16 16:04:04.254431 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:04.254124 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:04:04.254431 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:04.254129 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:04:04.254431 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:04.254173 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:36.254160124 +0000 UTC m=+95.694688563 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found Apr 16 16:04:04.254431 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:04.254187 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:04:36.254179914 +0000 UTC m=+95.694708346 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found Apr 16 16:04:05.863591 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:04:05.863534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:04:05.863981 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:05.863695 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:04:05.863981 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:05.863756 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:05:09.863739019 +0000 UTC m=+129.304267457 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : secret "metrics-daemon-secret" not found
Apr 16 16:04:10.500633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:04:10.500516 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gvxkh"
Apr 16 16:04:36.268812 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:04:36.268775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9"
Apr 16 16:04:36.269364 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:04:36.268874 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n"
Apr 16 16:04:36.269364 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:36.269008 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:04:36.269364 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:36.269053 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:04:36.269364 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:36.269087 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert podName:deb31f3a-b5fe-4b80-a8dd-2ef625183254 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:40.269066589 +0000 UTC m=+159.709595024 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert") pod "ingress-canary-pxsb9" (UID: "deb31f3a-b5fe-4b80-a8dd-2ef625183254") : secret "canary-serving-cert" not found
Apr 16 16:04:36.269364 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:04:36.269104 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls podName:045826c8-ee95-494f-8d06-d8d18b2717ca nodeName:}" failed. No retries permitted until 2026-04-16 16:05:40.26909644 +0000 UTC m=+159.709624873 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls") pod "dns-default-fm66n" (UID: "045826c8-ee95-494f-8d06-d8d18b2717ca") : secret "dns-default-metrics-tls" not found
Apr 16 16:05:09.895051 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:09.895012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg"
Apr 16 16:05:09.895506 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:09.895119 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:05:09.895506 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:09.895181 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs podName:3e5c15b5-f8c7-478b-a327-14aad8952c3f nodeName:}" failed. No retries permitted until 2026-04-16 16:07:11.895167028 +0000 UTC m=+251.335695461 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs") pod "network-metrics-daemon-4kkrg" (UID: "3e5c15b5-f8c7-478b-a327-14aad8952c3f") : secret "metrics-daemon-secret" not found
Apr 16 16:05:17.890950 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.890915 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-665647ff5f-shgzl"]
Apr 16 16:05:17.893267 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.893250 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.895790 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.895765 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 16:05:17.895911 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.895881 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 16:05:17.896332 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.896312 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-568f646654-h7nqt"]
Apr 16 16:05:17.896418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.896317 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 16:05:17.896522 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.896502 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6298j\""
Apr 16 16:05:17.898463 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.898446 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:17.901505 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.901489 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 16:05:17.903498 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.903478 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 16:05:17.903709 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.903693 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 16:05:17.903852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.903837 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 16:05:17.903940 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.903910 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 16:05:17.903940 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.903929 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-29lw4\""
Apr 16 16:05:17.904071 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.904057 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 16:05:17.904149 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.904136 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 16:05:17.909334 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.909316 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665647ff5f-shgzl"]
Apr 16 16:05:17.912395 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.912375 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-568f646654-h7nqt"]
Apr 16 16:05:17.947963 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.947938 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:17.948099 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.947967 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:17.948099 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948099 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948024 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-default-certificate\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:17.948099 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c6643ce-819b-493d-a650-95da3e927c4e-ca-trust-extracted\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948126 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-image-registry-private-configuration\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-installation-pull-secrets\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948186 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nw4\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-kube-api-access-t8nw4\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948223 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-bound-sa-token\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948259 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7wsk\" (UniqueName: \"kubernetes.io/projected/403c812f-312a-4792-b1da-b54e362000a8-kube-api-access-q7wsk\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:17.948488 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948307 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-stats-auth\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:17.948488 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948333 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-registry-certificates\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.948488 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.948370 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-trusted-ca\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:17.987104 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.987077 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"]
Apr 16 16:05:17.989166 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.989151 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"
Apr 16 16:05:17.990820 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.990794 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"]
Apr 16 16:05:17.991754 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.991736 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-w6kh4\""
Apr 16 16:05:17.992790 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.992775 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:17.996076 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.996057 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"]
Apr 16 16:05:17.996260 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.996238 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 16:05:17.996429 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.996401 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 16:05:17.996496 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.996467 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-4dmkt\""
Apr 16 16:05:17.998221 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:17.998205 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.000891 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.000872 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 16:05:18.000891 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.000889 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 16:05:18.001213 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.001197 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:05:18.001647 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.001632 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 16:05:18.001708 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.001677 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-7wfh2\""
Apr 16 16:05:18.011395 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.011372 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"]
Apr 16 16:05:18.012141 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.012123 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"]
Apr 16 16:05:18.025530 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.025505 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"]
Apr 16 16:05:18.048692 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048666 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-default-certificate\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.048867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048696 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c6643ce-819b-493d-a650-95da3e927c4e-ca-trust-extracted\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.048867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048725 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-image-registry-private-configuration\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.048867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048742 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-installation-pull-secrets\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.048867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nw4\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-kube-api-access-t8nw4\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.048867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048777 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-bound-sa-token\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.048867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048800 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048875 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6nf\" (UniqueName: \"kubernetes.io/projected/eedc515d-9a62-481f-9dbd-8fe29169a6b1-kube-api-access-bc6nf\") pod \"network-check-source-7b678d77c7-59cwx\" (UID: \"eedc515d-9a62-481f-9dbd-8fe29169a6b1\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048901 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99110ea-ed08-4c60-a0df-4d2bbde55794-config\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048932 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7wsk\" (UniqueName: \"kubernetes.io/projected/403c812f-312a-4792-b1da-b54e362000a8-kube-api-access-q7wsk\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-stats-auth\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.048995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-registry-certificates\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-trusted-ca\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049049 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049081 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049107 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d99110ea-ed08-4c60-a0df-4d2bbde55794-serving-cert\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.049175 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049154 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049181 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/745be784-393b-4faa-b213-9d593611899a-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049211 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqfk\" (UniqueName: \"kubernetes.io/projected/d99110ea-ed08-4c60-a0df-4d2bbde55794-kube-api-access-fsqfk\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049228 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c6643ce-819b-493d-a650-95da3e927c4e-ca-trust-extracted\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.049238 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.049300 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.54927981 +0000 UTC m=+137.989808250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : secret "router-metrics-certs-default" not found
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.049463 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.549446624 +0000 UTC m=+137.989975068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : configmap references non-existent config key: service-ca.crt
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.049539 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.049554 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665647ff5f-shgzl: secret "image-registry-tls" not found
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.049591 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls podName:9c6643ce-819b-493d-a650-95da3e927c4e nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.549578286 +0000 UTC m=+137.990106722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls") pod "image-registry-665647ff5f-shgzl" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e") : secret "image-registry-tls" not found
Apr 16 16:05:18.049724 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.049604 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-registry-certificates\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.050212 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.050107 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-trusted-ca\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.051506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.051479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-stats-auth\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.051654 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.051636 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-installation-pull-secrets\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.051704 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.051691 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-default-certificate\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.051918 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.051897 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-image-registry-private-configuration\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.059703 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.059679 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7wsk\" (UniqueName: \"kubernetes.io/projected/403c812f-312a-4792-b1da-b54e362000a8-kube-api-access-q7wsk\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:18.060447 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.060426 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-bound-sa-token\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.060863 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.060844 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nw4\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-kube-api-access-t8nw4\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:18.149867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.149779 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d99110ea-ed08-4c60-a0df-4d2bbde55794-serving-cert\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.149867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.149830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/745be784-393b-4faa-b213-9d593611899a-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:18.149867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.149849 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqfk\" (UniqueName: \"kubernetes.io/projected/d99110ea-ed08-4c60-a0df-4d2bbde55794-kube-api-access-fsqfk\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.150139 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.149892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:18.150139 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.149911 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6nf\" (UniqueName: \"kubernetes.io/projected/eedc515d-9a62-481f-9dbd-8fe29169a6b1-kube-api-access-bc6nf\") pod \"network-check-source-7b678d77c7-59cwx\" (UID: \"eedc515d-9a62-481f-9dbd-8fe29169a6b1\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"
Apr 16 16:05:18.150139 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.149926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99110ea-ed08-4c60-a0df-4d2bbde55794-config\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.150139 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.150008 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:05:18.150139 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.150084 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert podName:745be784-393b-4faa-b213-9d593611899a nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.650063344 +0000 UTC m=+138.090591794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-gnvkb" (UID: "745be784-393b-4faa-b213-9d593611899a") : secret "networking-console-plugin-cert" not found
Apr 16 16:05:18.150466 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.150449 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99110ea-ed08-4c60-a0df-4d2bbde55794-config\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.150511 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.150494 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/745be784-393b-4faa-b213-9d593611899a-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:18.152105 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.152080 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d99110ea-ed08-4c60-a0df-4d2bbde55794-serving-cert\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.159457 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.159437 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6nf\" (UniqueName: \"kubernetes.io/projected/eedc515d-9a62-481f-9dbd-8fe29169a6b1-kube-api-access-bc6nf\") pod \"network-check-source-7b678d77c7-59cwx\" (UID: \"eedc515d-9a62-481f-9dbd-8fe29169a6b1\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"
Apr 16 16:05:18.159641 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.159626 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqfk\" (UniqueName: \"kubernetes.io/projected/d99110ea-ed08-4c60-a0df-4d2bbde55794-kube-api-access-fsqfk\") pod \"service-ca-operator-69965bb79d-j6lc7\" (UID: \"d99110ea-ed08-4c60-a0df-4d2bbde55794\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.299100 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.299066 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"
Apr 16 16:05:18.310374 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.310347 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"
Apr 16 16:05:18.419394 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.419242 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx"]
Apr 16 16:05:18.421651 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:18.421608 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeedc515d_9a62_481f_9dbd_8fe29169a6b1.slice/crio-a4f227fec8babba8aae9609d1a94ec2b01939698cdefee8a83a24b2de7c8243f WatchSource:0}: Error finding container a4f227fec8babba8aae9609d1a94ec2b01939698cdefee8a83a24b2de7c8243f: Status 404 returned error can't find the container with id a4f227fec8babba8aae9609d1a94ec2b01939698cdefee8a83a24b2de7c8243f
Apr 16 16:05:18.435989 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.435907 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7"]
Apr 16 16:05:18.438334
ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:18.438302 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd99110ea_ed08_4c60_a0df_4d2bbde55794.slice/crio-b72aa13f1468352cf8658e208296140ff47747253a7b713657a3155d95cc3086 WatchSource:0}: Error finding container b72aa13f1468352cf8658e208296140ff47747253a7b713657a3155d95cc3086: Status 404 returned error can't find the container with id b72aa13f1468352cf8658e208296140ff47747253a7b713657a3155d95cc3086 Apr 16 16:05:18.552484 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.552453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:18.552484 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.552490 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.552527 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.552647 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:19.552606992 +0000 UTC m=+138.993135439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : configmap references non-existent config key: service-ca.crt Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.552655 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.552683 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.552693 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665647ff5f-shgzl: secret "image-registry-tls" not found Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.552720 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:19.552701771 +0000 UTC m=+138.993230211 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : secret "router-metrics-certs-default" not found Apr 16 16:05:18.552758 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.552739 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls podName:9c6643ce-819b-493d-a650-95da3e927c4e nodeName:}" failed. No retries permitted until 2026-04-16 16:05:19.552730159 +0000 UTC m=+138.993258596 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls") pod "image-registry-665647ff5f-shgzl" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e") : secret "image-registry-tls" not found Apr 16 16:05:18.653061 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.653021 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:18.653217 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.653153 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:05:18.653253 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:18.653229 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert podName:745be784-393b-4faa-b213-9d593611899a nodeName:}" failed. 
No retries permitted until 2026-04-16 16:05:19.653213152 +0000 UTC m=+139.093741589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-gnvkb" (UID: "745be784-393b-4faa-b213-9d593611899a") : secret "networking-console-plugin-cert" not found Apr 16 16:05:18.689541 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.689442 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7" event={"ID":"d99110ea-ed08-4c60-a0df-4d2bbde55794","Type":"ContainerStarted","Data":"b72aa13f1468352cf8658e208296140ff47747253a7b713657a3155d95cc3086"} Apr 16 16:05:18.690715 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.690691 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx" event={"ID":"eedc515d-9a62-481f-9dbd-8fe29169a6b1","Type":"ContainerStarted","Data":"8fa4483ab8c51d7f75200589a0b41a753bd9a978f314170c2374df95e30d1d93"} Apr 16 16:05:18.690827 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.690719 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx" event={"ID":"eedc515d-9a62-481f-9dbd-8fe29169a6b1","Type":"ContainerStarted","Data":"a4f227fec8babba8aae9609d1a94ec2b01939698cdefee8a83a24b2de7c8243f"} Apr 16 16:05:18.710214 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:18.710160 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-59cwx" podStartSLOduration=1.71014456 podStartE2EDuration="1.71014456s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 16:05:18.708806136 +0000 UTC m=+138.149334604" watchObservedRunningTime="2026-04-16 16:05:18.71014456 +0000 UTC m=+138.150673016" Apr 16 16:05:19.560633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:19.560580 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:19.560637 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:19.560681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.560742 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:21.560723789 +0000 UTC m=+141.001252225 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : configmap references non-existent config key: service-ca.crt Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.560793 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.560808 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665647ff5f-shgzl: secret "image-registry-tls" not found Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.560806 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.560862 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:21.560845423 +0000 UTC m=+141.001373869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : secret "router-metrics-certs-default" not found Apr 16 16:05:19.561011 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.560881 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls podName:9c6643ce-819b-493d-a650-95da3e927c4e nodeName:}" failed. 
No retries permitted until 2026-04-16 16:05:21.560872103 +0000 UTC m=+141.001400544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls") pod "image-registry-665647ff5f-shgzl" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e") : secret "image-registry-tls" not found Apr 16 16:05:19.661749 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:19.661699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:19.661932 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.661860 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:05:19.661996 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:19.661944 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert podName:745be784-393b-4faa-b213-9d593611899a nodeName:}" failed. No retries permitted until 2026-04-16 16:05:21.661918095 +0000 UTC m=+141.102446533 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-gnvkb" (UID: "745be784-393b-4faa-b213-9d593611899a") : secret "networking-console-plugin-cert" not found Apr 16 16:05:20.696506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:20.696471 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7" event={"ID":"d99110ea-ed08-4c60-a0df-4d2bbde55794","Type":"ContainerStarted","Data":"769f560637c0e369f06c6a71973411952c9503fa3fcb5be0f864e2f00fbac8d6"} Apr 16 16:05:20.714429 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:20.714382 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7" podStartSLOduration=1.597371823 podStartE2EDuration="3.714366908s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:18.439938813 +0000 UTC m=+137.880467247" lastFinishedPulling="2026-04-16 16:05:20.5569339 +0000 UTC m=+139.997462332" observedRunningTime="2026-04-16 16:05:20.713456688 +0000 UTC m=+140.153985154" watchObservedRunningTime="2026-04-16 16:05:20.714366908 +0000 UTC m=+140.154895362" Apr 16 16:05:21.576197 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:21.576160 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:21.576197 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:21.576202 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:21.576235 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.576346 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.576360 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665647ff5f-shgzl: secret "image-registry-tls" not found Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.576374 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:25.576352248 +0000 UTC m=+145.016880698 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : configmap references non-existent config key: service-ca.crt Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.576407 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls podName:9c6643ce-819b-493d-a650-95da3e927c4e nodeName:}" failed. No retries permitted until 2026-04-16 16:05:25.576396226 +0000 UTC m=+145.016924658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls") pod "image-registry-665647ff5f-shgzl" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e") : secret "image-registry-tls" not found Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.576344 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:05:21.576468 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.576447 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:25.576438546 +0000 UTC m=+145.016966982 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : secret "router-metrics-certs-default" not found Apr 16 16:05:21.677325 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:21.677289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:21.677483 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.677432 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:05:21.677527 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:21.677498 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert podName:745be784-393b-4faa-b213-9d593611899a nodeName:}" failed. No retries permitted until 2026-04-16 16:05:25.677484227 +0000 UTC m=+145.118012664 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-gnvkb" (UID: "745be784-393b-4faa-b213-9d593611899a") : secret "networking-console-plugin-cert" not found Apr 16 16:05:25.049876 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:25.049853 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjkzm_b54e0816-7a3e-49eb-bc50-eebcbb3a03c2/dns-node-resolver/0.log" Apr 16 16:05:25.608633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:25.608576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:25.608633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:25.608634 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt" Apr 16 16:05:25.608865 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:25.608673 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:05:25.608865 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.608745 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.608727594 +0000 UTC m=+153.049256032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : configmap references non-existent config key: service-ca.crt Apr 16 16:05:25.608865 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.608798 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:05:25.608865 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.608846 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs podName:403c812f-312a-4792-b1da-b54e362000a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.608834781 +0000 UTC m=+153.049363218 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs") pod "router-default-568f646654-h7nqt" (UID: "403c812f-312a-4792-b1da-b54e362000a8") : secret "router-metrics-certs-default" not found Apr 16 16:05:25.608865 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.608800 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:25.609047 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.608871 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665647ff5f-shgzl: secret "image-registry-tls" not found Apr 16 16:05:25.609047 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.608914 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls podName:9c6643ce-819b-493d-a650-95da3e927c4e nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.608903615 +0000 UTC m=+153.049432052 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls") pod "image-registry-665647ff5f-shgzl" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e") : secret "image-registry-tls" not found Apr 16 16:05:25.709823 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:25.709793 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:25.709997 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.709892 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:05:25.709997 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:25.709937 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert podName:745be784-393b-4faa-b213-9d593611899a nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.709923826 +0000 UTC m=+153.150452266 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-gnvkb" (UID: "745be784-393b-4faa-b213-9d593611899a") : secret "networking-console-plugin-cert" not found
Apr 16 16:05:25.852468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:25.852421 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kl4jm_19e9bbf4-2f93-4060-aa4f-2838412f8254/node-ca/0.log"
Apr 16 16:05:33.670538 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.670479 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:33.670538 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.670536 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:33.670991 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.670564 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:33.671677 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.671658 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/403c812f-312a-4792-b1da-b54e362000a8-service-ca-bundle\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:33.672921 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.672897 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/403c812f-312a-4792-b1da-b54e362000a8-metrics-certs\") pod \"router-default-568f646654-h7nqt\" (UID: \"403c812f-312a-4792-b1da-b54e362000a8\") " pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:33.672921 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.672911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"image-registry-665647ff5f-shgzl\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:33.771694 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.771652 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"
Apr 16 16:05:33.771887 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:33.771792 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:05:33.771887 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:33.771863 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert podName:745be784-393b-4faa-b213-9d593611899a nodeName:}" failed. No retries permitted until 2026-04-16 16:05:49.771847532 +0000 UTC m=+169.212375969 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-gnvkb" (UID: "745be784-393b-4faa-b213-9d593611899a") : secret "networking-console-plugin-cert" not found
Apr 16 16:05:33.802342 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.802305 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:33.808153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.808114 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:33.932177 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.932155 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665647ff5f-shgzl"]
Apr 16 16:05:33.934888 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:33.934866 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c6643ce_819b_493d_a650_95da3e927c4e.slice/crio-3b05f54f0b02ba559be70b229b39bdbc6c9fc6b524e533f4920e8ec2a170637a WatchSource:0}: Error finding container 3b05f54f0b02ba559be70b229b39bdbc6c9fc6b524e533f4920e8ec2a170637a: Status 404 returned error can't find the container with id 3b05f54f0b02ba559be70b229b39bdbc6c9fc6b524e533f4920e8ec2a170637a
Apr 16 16:05:33.946676 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:33.946649 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-568f646654-h7nqt"]
Apr 16 16:05:33.949450 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:33.949427 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403c812f_312a_4792_b1da_b54e362000a8.slice/crio-6ba41d4f50c55d3cca978968b6d44e08ab451246ffd1a669a825c4e5441698d6 WatchSource:0}: Error finding container 6ba41d4f50c55d3cca978968b6d44e08ab451246ffd1a669a825c4e5441698d6: Status 404 returned error can't find the container with id 6ba41d4f50c55d3cca978968b6d44e08ab451246ffd1a669a825c4e5441698d6
Apr 16 16:05:34.725560 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.725513 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-568f646654-h7nqt" event={"ID":"403c812f-312a-4792-b1da-b54e362000a8","Type":"ContainerStarted","Data":"64c61296ca9ffe3ac17a91dfadcc97e4fada0dfe8e231657c51cda33bbd69883"}
Apr 16 16:05:34.725560 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.725554 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-568f646654-h7nqt" event={"ID":"403c812f-312a-4792-b1da-b54e362000a8","Type":"ContainerStarted","Data":"6ba41d4f50c55d3cca978968b6d44e08ab451246ffd1a669a825c4e5441698d6"}
Apr 16 16:05:34.726874 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.726840 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" event={"ID":"9c6643ce-819b-493d-a650-95da3e927c4e","Type":"ContainerStarted","Data":"145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770"}
Apr 16 16:05:34.726874 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.726871 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" event={"ID":"9c6643ce-819b-493d-a650-95da3e927c4e","Type":"ContainerStarted","Data":"3b05f54f0b02ba559be70b229b39bdbc6c9fc6b524e533f4920e8ec2a170637a"}
Apr 16 16:05:34.727005 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.726985 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-665647ff5f-shgzl"
Apr 16 16:05:34.746203 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.746159 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-568f646654-h7nqt" podStartSLOduration=17.746146045 podStartE2EDuration="17.746146045s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:05:34.744997507 +0000 UTC m=+154.185525965" watchObservedRunningTime="2026-04-16 16:05:34.746146045 +0000 UTC m=+154.186674496"
Apr 16 16:05:34.766045 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.766002 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" podStartSLOduration=17.765987445 podStartE2EDuration="17.765987445s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:05:34.765458589 +0000 UTC m=+154.205987082" watchObservedRunningTime="2026-04-16 16:05:34.765987445 +0000 UTC m=+154.206515900"
Apr 16 16:05:34.808421 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.808388 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:34.810996 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:34.810966 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:35.458756 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:35.458701 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fm66n" podUID="045826c8-ee95-494f-8d06-d8d18b2717ca"
Apr 16 16:05:35.478891 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:35.478855 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pxsb9" podUID="deb31f3a-b5fe-4b80-a8dd-2ef625183254"
Apr 16 16:05:35.729350 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:35.729267 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fm66n"
Apr 16 16:05:35.729730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:35.729548 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pxsb9"
Apr 16 16:05:35.729816 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:35.729802 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:35.730787 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:35.730768 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-568f646654-h7nqt"
Apr 16 16:05:36.213261 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:05:36.213223 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4kkrg" podUID="3e5c15b5-f8c7-478b-a327-14aad8952c3f"
Apr 16 16:05:40.324770 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.324677 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9"
Apr 16 16:05:40.325141 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.324807 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n"
Apr 16 16:05:40.327174 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.327154 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb31f3a-b5fe-4b80-a8dd-2ef625183254-cert\") pod \"ingress-canary-pxsb9\" (UID: \"deb31f3a-b5fe-4b80-a8dd-2ef625183254\") " pod="openshift-ingress-canary/ingress-canary-pxsb9"
Apr 16 16:05:40.327219 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.327163 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/045826c8-ee95-494f-8d06-d8d18b2717ca-metrics-tls\") pod \"dns-default-fm66n\" (UID: \"045826c8-ee95-494f-8d06-d8d18b2717ca\") " pod="openshift-dns/dns-default-fm66n"
Apr 16 16:05:40.533671 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.533636 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wjzs4\""
Apr 16 16:05:40.534817 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.534800 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v9cqx\""
Apr 16 16:05:40.540908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.540889 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pxsb9"
Apr 16 16:05:40.540953 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.540908 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fm66n"
Apr 16 16:05:40.676670 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.676552 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fm66n"]
Apr 16 16:05:40.679833 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:40.679805 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045826c8_ee95_494f_8d06_d8d18b2717ca.slice/crio-fa4d06e6307e41193d1a6a4207f88bc7ddf65277f0acb8d292c130cfaf74e216 WatchSource:0}: Error finding container fa4d06e6307e41193d1a6a4207f88bc7ddf65277f0acb8d292c130cfaf74e216: Status 404 returned error can't find the container with id fa4d06e6307e41193d1a6a4207f88bc7ddf65277f0acb8d292c130cfaf74e216
Apr 16 16:05:40.689575 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.689500 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pxsb9"]
Apr 16 16:05:40.691249 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:40.691225 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeb31f3a_b5fe_4b80_a8dd_2ef625183254.slice/crio-df6d5809d6c94ab3f8fdc0b7ce1a1d72a7829908020488e5ac8af72d71616e7b WatchSource:0}: Error finding container df6d5809d6c94ab3f8fdc0b7ce1a1d72a7829908020488e5ac8af72d71616e7b: Status 404 returned error can't find the container with id df6d5809d6c94ab3f8fdc0b7ce1a1d72a7829908020488e5ac8af72d71616e7b
Apr 16 16:05:40.741719 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.741685 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fm66n" event={"ID":"045826c8-ee95-494f-8d06-d8d18b2717ca","Type":"ContainerStarted","Data":"fa4d06e6307e41193d1a6a4207f88bc7ddf65277f0acb8d292c130cfaf74e216"}
Apr 16 16:05:40.742528 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:40.742505 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pxsb9" event={"ID":"deb31f3a-b5fe-4b80-a8dd-2ef625183254","Type":"ContainerStarted","Data":"df6d5809d6c94ab3f8fdc0b7ce1a1d72a7829908020488e5ac8af72d71616e7b"}
Apr 16 16:05:42.749296 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:42.749203 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pxsb9" event={"ID":"deb31f3a-b5fe-4b80-a8dd-2ef625183254","Type":"ContainerStarted","Data":"865413cf97bb776fee63a90cc4edc33da177a20900be1e97a9be9360db5b31e7"}
Apr 16 16:05:42.750807 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:42.750779 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fm66n" event={"ID":"045826c8-ee95-494f-8d06-d8d18b2717ca","Type":"ContainerStarted","Data":"cc19d5ec9f817d5e37e6df3b2ee7391394219a9072721f482145017f70531c9f"}
Apr 16 16:05:42.750908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:42.750812 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fm66n" event={"ID":"045826c8-ee95-494f-8d06-d8d18b2717ca","Type":"ContainerStarted","Data":"f1518e02935babb5f1f5abe4606dd028588ab061e7d75ea92d540fa6f557a29f"}
Apr 16 16:05:42.750908 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:42.750890 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fm66n"
Apr 16 16:05:42.766308 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:42.766263 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pxsb9" podStartSLOduration=129.028466333 podStartE2EDuration="2m10.766251576s" podCreationTimestamp="2026-04-16 16:03:32 +0000 UTC" firstStartedPulling="2026-04-16 16:05:40.69298741 +0000 UTC m=+160.133515847" lastFinishedPulling="2026-04-16 16:05:42.430772653 +0000 UTC m=+161.871301090" observedRunningTime="2026-04-16 16:05:42.765455628 +0000 UTC m=+162.205984085" watchObservedRunningTime="2026-04-16 16:05:42.766251576 +0000 UTC m=+162.206780026"
Apr 16 16:05:42.784304 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:42.784265 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fm66n" podStartSLOduration=129.038483189 podStartE2EDuration="2m10.784255447s" podCreationTimestamp="2026-04-16 16:03:32 +0000 UTC" firstStartedPulling="2026-04-16 16:05:40.68171064 +0000 UTC m=+160.122239079" lastFinishedPulling="2026-04-16 16:05:42.427482901 +0000 UTC m=+161.868011337" observedRunningTime="2026-04-16 16:05:42.78293715 +0000 UTC m=+162.223465605" watchObservedRunningTime="2026-04-16 16:05:42.784255447 +0000 UTC m=+162.224783902"
Apr 16 16:05:48.199250 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.199202 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg"
Apr 16 16:05:48.556732 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.556654 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-665647ff5f-shgzl"]
Apr 16 16:05:48.562304 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.562269 2566 patch_prober.go:28] interesting pod/image-registry-665647ff5f-shgzl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 16:05:48.562468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.562324 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" podUID="9c6643ce-819b-493d-a650-95da3e927c4e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:05:48.586469 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.585970 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vdrp4"]
Apr 16 16:05:48.590695 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.589992 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.597498 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.597452 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-p4mnm\""
Apr 16 16:05:48.597498 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.597477 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 16:05:48.597993 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.597974 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 16:05:48.598106 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.597974 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 16:05:48.598106 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.598045 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 16:05:48.618871 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.618842 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vdrp4"]
Apr 16 16:05:48.684537 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.684501 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6cabfaf7-a334-4298-988b-7cb9b7e35302-data-volume\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.684537 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.684539 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6cabfaf7-a334-4298-988b-7cb9b7e35302-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.684740 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.684575 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6cabfaf7-a334-4298-988b-7cb9b7e35302-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.684740 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.684650 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bkhf\" (UniqueName: \"kubernetes.io/projected/6cabfaf7-a334-4298-988b-7cb9b7e35302-kube-api-access-7bkhf\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.684740 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.684679 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6cabfaf7-a334-4298-988b-7cb9b7e35302-crio-socket\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.688041 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.688020 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77c9bfcb4d-8wlph"]
Apr 16 16:05:48.690489 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.690475 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.711953 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.711932 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77c9bfcb4d-8wlph"]
Apr 16 16:05:48.785725 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785692 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6cabfaf7-a334-4298-988b-7cb9b7e35302-crio-socket\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.785901 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785734 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-installation-pull-secrets\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.785901 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-image-registry-private-configuration\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.785901 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785809 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-registry-tls\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.785901 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785815 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6cabfaf7-a334-4298-988b-7cb9b7e35302-crio-socket\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.785901 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785831 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfs9x\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-kube-api-access-nfs9x\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.785901 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785890 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-trusted-ca\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6cabfaf7-a334-4298-988b-7cb9b7e35302-data-volume\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785971 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-bound-sa-token\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.785993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6cabfaf7-a334-4298-988b-7cb9b7e35302-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.786035 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6cabfaf7-a334-4298-988b-7cb9b7e35302-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.786053 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-ca-trust-extracted\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.786069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-registry-certificates\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.786153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.786089 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bkhf\" (UniqueName: \"kubernetes.io/projected/6cabfaf7-a334-4298-988b-7cb9b7e35302-kube-api-access-7bkhf\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.786457 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.786363 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6cabfaf7-a334-4298-988b-7cb9b7e35302-data-volume\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.786551 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.786534 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6cabfaf7-a334-4298-988b-7cb9b7e35302-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.788339 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.788324 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6cabfaf7-a334-4298-988b-7cb9b7e35302-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.795208 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.795188 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bkhf\" (UniqueName: \"kubernetes.io/projected/6cabfaf7-a334-4298-988b-7cb9b7e35302-kube-api-access-7bkhf\") pod \"insights-runtime-extractor-vdrp4\" (UID: \"6cabfaf7-a334-4298-988b-7cb9b7e35302\") " pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.886897 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.886865 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-installation-pull-secrets\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887088 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.886920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-image-registry-private-configuration\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887088 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.886969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-registry-tls\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887088 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.886995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfs9x\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-kube-api-access-nfs9x\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887088 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.887021 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-trusted-ca\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887088 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.887050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-bound-sa-token\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887341 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.887107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-ca-trust-extracted\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887341 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.887134 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-registry-certificates\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.887574 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.887524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-ca-trust-extracted\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.888048 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.888025 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-registry-certificates\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.888157 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.888136 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-trusted-ca\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.889404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.889378 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-image-registry-private-configuration\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.889514 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.889498 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-registry-tls\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.889596 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.889577 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-installation-pull-secrets\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.896783 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.896758 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-bound-sa-token\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.897160 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.897138 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfs9x\" (UniqueName: \"kubernetes.io/projected/9ea9ebe0-ee87-4861-b4c4-b7866eef641d-kube-api-access-nfs9x\") pod \"image-registry-77c9bfcb4d-8wlph\" (UID: \"9ea9ebe0-ee87-4861-b4c4-b7866eef641d\") " pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:48.905669 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.905652 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vdrp4"
Apr 16 16:05:48.999331 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:48.998847 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph"
Apr 16 16:05:49.055296 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.055270 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vdrp4"]
Apr 16 16:05:49.060157 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:49.060124 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cabfaf7_a334_4298_988b_7cb9b7e35302.slice/crio-7ab716c057a86854653dee305fe8a33018476ab4bf4b5237ca019bd44c9aee06 WatchSource:0}: Error finding container 7ab716c057a86854653dee305fe8a33018476ab4bf4b5237ca019bd44c9aee06: Status 404 returned error can't find the container with id 7ab716c057a86854653dee305fe8a33018476ab4bf4b5237ca019bd44c9aee06
Apr 16 16:05:49.144007 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.143932 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77c9bfcb4d-8wlph"]
Apr 16 16:05:49.147059 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:49.147035 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea9ebe0_ee87_4861_b4c4_b7866eef641d.slice/crio-5ea676a2a57ab2aa8278bd7a9241c062824d165e8cd3c2ab59bcd3dc286f4b70 WatchSource:0}: Error finding container 5ea676a2a57ab2aa8278bd7a9241c062824d165e8cd3c2ab59bcd3dc286f4b70: Status 404 returned error can't find the container with id 5ea676a2a57ab2aa8278bd7a9241c062824d165e8cd3c2ab59bcd3dc286f4b70
Apr 16 16:05:49.768982 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.768950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdrp4" event={"ID":"6cabfaf7-a334-4298-988b-7cb9b7e35302","Type":"ContainerStarted","Data":"800d08f3d75f3a3a6358452447a83763cd98ac45448e764f6da82812465a452c"}
Apr 16 16:05:49.768982 ip-10-0-136-220
kubenswrapper[2566]: I0416 16:05:49.768986 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdrp4" event={"ID":"6cabfaf7-a334-4298-988b-7cb9b7e35302","Type":"ContainerStarted","Data":"8115653ec8ebeb440144cd78498b753d65fcf2d6a59f541847882f2718ec46ce"} Apr 16 16:05:49.769392 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.769000 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdrp4" event={"ID":"6cabfaf7-a334-4298-988b-7cb9b7e35302","Type":"ContainerStarted","Data":"7ab716c057a86854653dee305fe8a33018476ab4bf4b5237ca019bd44c9aee06"} Apr 16 16:05:49.770191 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.770160 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph" event={"ID":"9ea9ebe0-ee87-4861-b4c4-b7866eef641d","Type":"ContainerStarted","Data":"e07570be24ee3df26a734ed26751cc24e8e99cfe53b40088a9dd931aaa1d438a"} Apr 16 16:05:49.770304 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.770195 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph" event={"ID":"9ea9ebe0-ee87-4861-b4c4-b7866eef641d","Type":"ContainerStarted","Data":"5ea676a2a57ab2aa8278bd7a9241c062824d165e8cd3c2ab59bcd3dc286f4b70"} Apr 16 16:05:49.770412 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.770395 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph" Apr 16 16:05:49.795461 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.795421 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:49.796383 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.796340 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph" podStartSLOduration=1.79632795 podStartE2EDuration="1.79632795s" podCreationTimestamp="2026-04-16 16:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:05:49.794204214 +0000 UTC m=+169.234732669" watchObservedRunningTime="2026-04-16 16:05:49.79632795 +0000 UTC m=+169.236856404" Apr 16 16:05:49.798161 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.798139 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/745be784-393b-4faa-b213-9d593611899a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-gnvkb\" (UID: \"745be784-393b-4faa-b213-9d593611899a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:49.805992 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.805973 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" Apr 16 16:05:49.922083 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:49.922053 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb"] Apr 16 16:05:49.925732 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:05:49.925709 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod745be784_393b_4faa_b213_9d593611899a.slice/crio-cfbedefdfae3b4ecb80322b2410bf48ecb4a369a09ad1fd6352395b2a6d7860c WatchSource:0}: Error finding container cfbedefdfae3b4ecb80322b2410bf48ecb4a369a09ad1fd6352395b2a6d7860c: Status 404 returned error can't find the container with id cfbedefdfae3b4ecb80322b2410bf48ecb4a369a09ad1fd6352395b2a6d7860c Apr 16 16:05:50.777604 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:50.777562 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" event={"ID":"745be784-393b-4faa-b213-9d593611899a","Type":"ContainerStarted","Data":"cfbedefdfae3b4ecb80322b2410bf48ecb4a369a09ad1fd6352395b2a6d7860c"} Apr 16 16:05:51.781970 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:51.781877 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdrp4" event={"ID":"6cabfaf7-a334-4298-988b-7cb9b7e35302","Type":"ContainerStarted","Data":"105f31e054f4708464fc5f92c61b46b13dd5c9451d5a1a937a6fc98ad3bb4252"} Apr 16 16:05:51.783190 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:51.783164 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" event={"ID":"745be784-393b-4faa-b213-9d593611899a","Type":"ContainerStarted","Data":"85a65f9820ee1aa1d057ec41237902c77da0ec71e8662957019bc26beba05a11"} Apr 16 16:05:51.803143 ip-10-0-136-220 
kubenswrapper[2566]: I0416 16:05:51.803103 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vdrp4" podStartSLOduration=1.528676161 podStartE2EDuration="3.803090683s" podCreationTimestamp="2026-04-16 16:05:48 +0000 UTC" firstStartedPulling="2026-04-16 16:05:49.122677409 +0000 UTC m=+168.563205843" lastFinishedPulling="2026-04-16 16:05:51.397091927 +0000 UTC m=+170.837620365" observedRunningTime="2026-04-16 16:05:51.801706418 +0000 UTC m=+171.242234877" watchObservedRunningTime="2026-04-16 16:05:51.803090683 +0000 UTC m=+171.243619138" Apr 16 16:05:52.756066 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:52.756035 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fm66n" Apr 16 16:05:52.787487 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:52.787368 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-gnvkb" podStartSLOduration=34.321155983 podStartE2EDuration="35.787352138s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:49.9274899 +0000 UTC m=+169.368018332" lastFinishedPulling="2026-04-16 16:05:51.39368605 +0000 UTC m=+170.834214487" observedRunningTime="2026-04-16 16:05:51.819661791 +0000 UTC m=+171.260190242" watchObservedRunningTime="2026-04-16 16:05:52.787352138 +0000 UTC m=+172.227880597" Apr 16 16:05:56.797075 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:56.797040 2566 generic.go:358] "Generic (PLEG): container finished" podID="eff58eab-eb6b-458d-bdfd-c30bc967ef8c" containerID="b91ce3951b2abe3f1bd445763b3150bfe92d95e0cc60d3208c29bd3ba7cb22db" exitCode=255 Apr 16 16:05:56.797469 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:56.797084 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" 
event={"ID":"eff58eab-eb6b-458d-bdfd-c30bc967ef8c","Type":"ContainerDied","Data":"b91ce3951b2abe3f1bd445763b3150bfe92d95e0cc60d3208c29bd3ba7cb22db"} Apr 16 16:05:56.802804 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:56.802784 2566 scope.go:117] "RemoveContainer" containerID="b91ce3951b2abe3f1bd445763b3150bfe92d95e0cc60d3208c29bd3ba7cb22db" Apr 16 16:05:57.801129 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:57.801093 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c6d54cf79-ff67r" event={"ID":"eff58eab-eb6b-458d-bdfd-c30bc967ef8c","Type":"ContainerStarted","Data":"9b21978ea880d9234d2e44a47a5a74f84336f9e80ae57d4364b286555c646e20"} Apr 16 16:05:58.560850 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:05:58.560823 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:06:01.493332 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.493298 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nchbt"] Apr 16 16:06:01.496473 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.496453 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.499145 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.499124 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:06:01.499592 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.499575 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:06:01.500079 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.500061 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:06:01.500217 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.500195 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:06:01.500323 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.500245 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:06:01.500737 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.500720 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:06:01.500827 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.500737 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-87tn6\"" Apr 16 16:06:01.590013 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.589975 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-tls\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " 
pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590013 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590015 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-wtmp\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590244 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590034 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-accelerators-collector-config\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590244 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590244 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-sys\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590244 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590129 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nhncw\" (UniqueName: \"kubernetes.io/projected/1a908bba-4a3e-4450-9923-8041ecce747a-kube-api-access-nhncw\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590244 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590143 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-root\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590244 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590195 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-textfile\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.590461 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.590250 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a908bba-4a3e-4450-9923-8041ecce747a-metrics-client-ca\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.690832 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.690783 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-sys\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.690832 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.690842 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhncw\" (UniqueName: \"kubernetes.io/projected/1a908bba-4a3e-4450-9923-8041ecce747a-kube-api-access-nhncw\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.690911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-sys\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.690952 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-root\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691002 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-textfile\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691022 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a908bba-4a3e-4450-9923-8041ecce747a-metrics-client-ca\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691034 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-root\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691049 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-tls\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691087 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691074 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-wtmp\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691389 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691098 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-accelerators-collector-config\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691389 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" 
Apr 16 16:06:01.691389 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:06:01.691159 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:06:01.691389 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691233 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-wtmp\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691389 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:06:01.691246 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-tls podName:1a908bba-4a3e-4450-9923-8041ecce747a nodeName:}" failed. No retries permitted until 2026-04-16 16:06:02.191225503 +0000 UTC m=+181.631753939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-tls") pod "node-exporter-nchbt" (UID: "1a908bba-4a3e-4450-9923-8041ecce747a") : secret "node-exporter-tls" not found Apr 16 16:06:01.691389 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691347 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-textfile\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691645 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691607 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a908bba-4a3e-4450-9923-8041ecce747a-metrics-client-ca\") pod \"node-exporter-nchbt\" (UID: 
\"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.691760 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.691738 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-accelerators-collector-config\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.693369 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.693351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:01.699158 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:01.699137 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhncw\" (UniqueName: \"kubernetes.io/projected/1a908bba-4a3e-4450-9923-8041ecce747a-kube-api-access-nhncw\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:02.194981 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:02.194945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-tls\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:02.197254 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:02.197230 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/1a908bba-4a3e-4450-9923-8041ecce747a-node-exporter-tls\") pod \"node-exporter-nchbt\" (UID: \"1a908bba-4a3e-4450-9923-8041ecce747a\") " pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:02.405693 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:02.405661 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nchbt" Apr 16 16:06:02.413787 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:06:02.413762 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a908bba_4a3e_4450_9923_8041ecce747a.slice/crio-874a510c27f19e9b3918a66f974b07b73ebe5b1b3cd1515c3da2918158669dd0 WatchSource:0}: Error finding container 874a510c27f19e9b3918a66f974b07b73ebe5b1b3cd1515c3da2918158669dd0: Status 404 returned error can't find the container with id 874a510c27f19e9b3918a66f974b07b73ebe5b1b3cd1515c3da2918158669dd0 Apr 16 16:06:02.815682 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:02.815648 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nchbt" event={"ID":"1a908bba-4a3e-4450-9923-8041ecce747a","Type":"ContainerStarted","Data":"874a510c27f19e9b3918a66f974b07b73ebe5b1b3cd1515c3da2918158669dd0"} Apr 16 16:06:03.593982 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.593950 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"] Apr 16 16:06:03.596559 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.596542 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" Apr 16 16:06:03.600434 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.600406 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 16:06:03.600539 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.600454 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 16:06:03.600604 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.600535 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 16:06:03.600604 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.600507 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 16:06:03.600733 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.600686 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 16:06:03.600842 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.600825 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-19ug28gh8k971\"" Apr 16 16:06:03.601284 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.601264 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-txj96\"" Apr 16 16:06:03.605845 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.605826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-tls\") pod \"thanos-querier-f56b7dbf8-hh5gc\" 
(UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.605960 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.605855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-grpc-tls\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.605960 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.605883 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrsx\" (UniqueName: \"kubernetes.io/projected/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-kube-api-access-rbrsx\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.605960 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.605901 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.605960 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.605918 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.606138 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.605996 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.606138 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.606038 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-metrics-client-ca\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.606138 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.606094 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.619207 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.619181 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"]
Apr 16 16:06:03.707211 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707171 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707231 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-tls\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707254 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-grpc-tls\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrsx\" (UniqueName: \"kubernetes.io/projected/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-kube-api-access-rbrsx\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707303 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707333 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.707404 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.707373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-metrics-client-ca\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.708226 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.708177 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-metrics-client-ca\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.710465 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.710436 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.710589 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.710552 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-tls\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.710589 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.710577 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.710711 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.710634 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.710711 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.710667 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.710793 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.710778 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-secret-grpc-tls\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.723790 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.723754 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrsx\" (UniqueName: \"kubernetes.io/projected/0dbc1cfe-574b-4558-b9a0-9dff690c8dc2-kube-api-access-rbrsx\") pod \"thanos-querier-f56b7dbf8-hh5gc\" (UID: \"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2\") " pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:03.819258 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.819224 2566 generic.go:358] "Generic (PLEG): container finished" podID="1a908bba-4a3e-4450-9923-8041ecce747a" containerID="86c512f5e5297d2b4a5f60f586484900e2730c6e26c25f43700138b392dd6706" exitCode=0
Apr 16 16:06:03.819258 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.819261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nchbt" event={"ID":"1a908bba-4a3e-4450-9923-8041ecce747a","Type":"ContainerDied","Data":"86c512f5e5297d2b4a5f60f586484900e2730c6e26c25f43700138b392dd6706"}
Apr 16 16:06:03.905259 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:03.905235 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:04.039106 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:04.039070 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"]
Apr 16 16:06:04.041938 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:06:04.041914 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dbc1cfe_574b_4558_b9a0_9dff690c8dc2.slice/crio-8dac451a78d47ebd47442f63fd56b1d78066a3df330a8542b646d584423fce03 WatchSource:0}: Error finding container 8dac451a78d47ebd47442f63fd56b1d78066a3df330a8542b646d584423fce03: Status 404 returned error can't find the container with id 8dac451a78d47ebd47442f63fd56b1d78066a3df330a8542b646d584423fce03
Apr 16 16:06:04.823103 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:04.823068 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"8dac451a78d47ebd47442f63fd56b1d78066a3df330a8542b646d584423fce03"}
Apr 16 16:06:04.824667 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:04.824638 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nchbt" event={"ID":"1a908bba-4a3e-4450-9923-8041ecce747a","Type":"ContainerStarted","Data":"e71c62e2866d96cfab7e4892d4a3fdf81b76e104b946a51cdbfc014e82937184"}
Apr 16 16:06:04.824792 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:04.824673 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nchbt" event={"ID":"1a908bba-4a3e-4450-9923-8041ecce747a","Type":"ContainerStarted","Data":"f7cdb3ce96cafeb6c33ba3efb9b4d5573e3dfb0154d74bf1c438bb88615fe1eb"}
Apr 16 16:06:04.848765 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:04.848718 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nchbt" podStartSLOduration=3.134710811 podStartE2EDuration="3.848704696s" podCreationTimestamp="2026-04-16 16:06:01 +0000 UTC" firstStartedPulling="2026-04-16 16:06:02.415335084 +0000 UTC m=+181.855863516" lastFinishedPulling="2026-04-16 16:06:03.129328953 +0000 UTC m=+182.569857401" observedRunningTime="2026-04-16 16:06:04.846913938 +0000 UTC m=+184.287442427" watchObservedRunningTime="2026-04-16 16:06:04.848704696 +0000 UTC m=+184.289233151"
Apr 16 16:06:06.831354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:06.831320 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"e3b7ad4852cb04e2ed210d67590c7e133e4668262511e5e6bc505e380201a888"}
Apr 16 16:06:06.831354 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:06.831355 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"a98438df000b7668a56108a5005179315678c93a536a3c6ae68d68348e65fc8e"}
Apr 16 16:06:06.831778 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:06.831365 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"98a450936bb7532a04e4a866f2c2470c0a13b5ac8ff820270f98d8b76837e7d1"}
Apr 16 16:06:07.836947 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.836911 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"ad1b4982c67f1d580c46eccc3b67e7823b073bdc54efa7375b397ffa6cc302d6"}
Apr 16 16:06:07.836947 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.836947 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"eaf6bc054dd811eb65e13ea8d607aeb58100cc93a348f9ec3a5cf3f54b022f40"}
Apr 16 16:06:07.837389 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.836957 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" event={"ID":"0dbc1cfe-574b-4558-b9a0-9dff690c8dc2","Type":"ContainerStarted","Data":"c69c8d8d8c9b30fdde4cdcd7834897abeeb7b4e616ed0d9d59367753520b684a"}
Apr 16 16:06:07.837389 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.837043 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc"
Apr 16 16:06:07.859223 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.859200 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:06:07.862112 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.862095 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.869547 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869525 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 16:06:07.869547 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869537 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 16:06:07.869730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869559 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 16:06:07.869730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869607 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 16:06:07.869730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869526 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 16:06:07.869730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869526 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 16:06:07.869730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869543 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 16:06:07.869730 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869629 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 16:06:07.870018 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869588 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 16:06:07.870018 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869769 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 16:06:07.870018 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869928 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 16:06:07.870018 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.869991 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-xg5bf\""
Apr 16 16:06:07.870018 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.870004 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 16:06:07.870843 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.870828 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8fiqqigdojfrb\""
Apr 16 16:06:07.872222 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.872193 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 16:06:07.884564 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.884522 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" podStartSLOduration=1.698556574 podStartE2EDuration="4.884511218s" podCreationTimestamp="2026-04-16 16:06:03 +0000 UTC" firstStartedPulling="2026-04-16 16:06:04.043725556 +0000 UTC m=+183.484253989" lastFinishedPulling="2026-04-16 16:06:07.229680198 +0000 UTC m=+186.670208633" observedRunningTime="2026-04-16 16:06:07.883010923 +0000 UTC m=+187.323539378" watchObservedRunningTime="2026-04-16 16:06:07.884511218 +0000 UTC m=+187.325039673"
Apr 16 16:06:07.890184 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.890163 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:06:07.943430 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-config-out\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943430 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943434 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943520 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943555 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943597 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943648 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943670 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-web-config\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943984 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943722 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943984 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943984 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943885 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.943984 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.943955 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.944229 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.944096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.944229 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.944163 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.944229 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.944192 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-config\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.944369 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.944244 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.944369 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.944260 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:07.944369 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:07.944343 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtv8\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-kube-api-access-tdtv8\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.044928 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.044891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.044928 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.044929 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtv8\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-kube-api-access-tdtv8\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045150 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.044948 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-config-out\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045150 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.044966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045150 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.044988 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045150 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045135 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045357 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045183 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045357 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045220 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045357 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045259 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045357 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045294 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-web-config\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045357 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045582 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045582 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045452 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045582 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045496 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045582 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045527 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045805 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045586 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045805 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-config\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.045805 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.045675 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.046339 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.046312 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.048822 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.048320 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.048822 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.048706 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.049285 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.049395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.049674 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.049842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050080 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050086 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050332 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-config-out\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:06:08.050418 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050390 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID:
\"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.050920 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.050920 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050543 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.051029 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.050973 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-config\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.052009 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.051988 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.052083 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.052045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.052283 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.052266 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-web-config\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.056795 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.056777 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtv8\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-kube-api-access-tdtv8\") pod \"prometheus-k8s-0\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.171520 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.171488 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:08.301130 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.300975 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:06:08.303583 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:06:08.303556 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14919623_8ba1_4709_a344_79cb68341e3c.slice/crio-eae3ad24ab73e36889780c2042dd6ce6d384879b06adfa5f9162767185594102 WatchSource:0}: Error finding container eae3ad24ab73e36889780c2042dd6ce6d384879b06adfa5f9162767185594102: Status 404 returned error can't find the container with id eae3ad24ab73e36889780c2042dd6ce6d384879b06adfa5f9162767185594102 Apr 16 16:06:08.841044 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:08.840990 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"eae3ad24ab73e36889780c2042dd6ce6d384879b06adfa5f9162767185594102"} Apr 16 16:06:09.845144 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:09.845108 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70" exitCode=0 Apr 16 16:06:09.845567 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:09.845170 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70"} Apr 16 16:06:10.782193 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:10.782157 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-77c9bfcb4d-8wlph" Apr 16 16:06:12.858882 
ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.858799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8"} Apr 16 16:06:12.858882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.858836 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858"} Apr 16 16:06:12.858882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.858846 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754"} Apr 16 16:06:12.858882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.858855 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12"} Apr 16 16:06:12.858882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.858863 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3"} Apr 16 16:06:12.858882 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.858870 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerStarted","Data":"72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424"} Apr 16 16:06:12.891729 
ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:12.891681 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.039602275 podStartE2EDuration="5.891668249s" podCreationTimestamp="2026-04-16 16:06:07 +0000 UTC" firstStartedPulling="2026-04-16 16:06:08.305469042 +0000 UTC m=+187.745997476" lastFinishedPulling="2026-04-16 16:06:12.157535011 +0000 UTC m=+191.598063450" observedRunningTime="2026-04-16 16:06:12.890472733 +0000 UTC m=+192.331001214" watchObservedRunningTime="2026-04-16 16:06:12.891668249 +0000 UTC m=+192.332196705" Apr 16 16:06:13.171633 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.171578 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:13.576788 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.576672 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" podUID="9c6643ce-819b-493d-a650-95da3e927c4e" containerName="registry" containerID="cri-o://145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770" gracePeriod=30 Apr 16 16:06:13.808709 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.808685 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:06:13.847198 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.847119 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-f56b7dbf8-hh5gc" Apr 16 16:06:13.862697 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.862665 2566 generic.go:358] "Generic (PLEG): container finished" podID="9c6643ce-819b-493d-a650-95da3e927c4e" containerID="145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770" exitCode=0 Apr 16 16:06:13.863051 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.862723 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" Apr 16 16:06:13.863051 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.862753 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" event={"ID":"9c6643ce-819b-493d-a650-95da3e927c4e","Type":"ContainerDied","Data":"145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770"} Apr 16 16:06:13.863051 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.862795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665647ff5f-shgzl" event={"ID":"9c6643ce-819b-493d-a650-95da3e927c4e","Type":"ContainerDied","Data":"3b05f54f0b02ba559be70b229b39bdbc6c9fc6b524e533f4920e8ec2a170637a"} Apr 16 16:06:13.863051 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.862821 2566 scope.go:117] "RemoveContainer" containerID="145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770" Apr 16 16:06:13.870188 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.870164 2566 scope.go:117] "RemoveContainer" containerID="145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770" Apr 16 16:06:13.870410 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:06:13.870391 2566 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770\": container with ID starting with 145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770 not found: ID does not exist" containerID="145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770" Apr 16 16:06:13.870467 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.870418 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770"} err="failed to get container status \"145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770\": rpc error: code = NotFound desc = could not find container \"145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770\": container with ID starting with 145642b05a7ec1f3ac54fc42e59f861f3cd3ac2b0c32a3bafe651f827ccad770 not found: ID does not exist" Apr 16 16:06:13.895105 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895070 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895268 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895123 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8nw4\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-kube-api-access-t8nw4\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895268 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895165 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/9c6643ce-819b-493d-a650-95da3e927c4e-ca-trust-extracted\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895268 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895192 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-registry-certificates\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895268 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895224 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-installation-pull-secrets\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895268 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895252 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-trusted-ca\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895538 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895291 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-bound-sa-token\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.895538 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.895324 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-image-registry-private-configuration\") pod \"9c6643ce-819b-493d-a650-95da3e927c4e\" (UID: \"9c6643ce-819b-493d-a650-95da3e927c4e\") " Apr 16 16:06:13.898287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.896324 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:06:13.898287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.897252 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:06:13.898287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.897945 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:06:13.898287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.898064 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:06:13.898287 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.898266 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-kube-api-access-t8nw4" (OuterVolumeSpecName: "kube-api-access-t8nw4") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "kube-api-access-t8nw4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:06:13.899926 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.899901 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:06:13.900249 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.900220 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:06:13.905672 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.905649 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c6643ce-819b-493d-a650-95da3e927c4e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9c6643ce-819b-493d-a650-95da3e927c4e" (UID: "9c6643ce-819b-493d-a650-95da3e927c4e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:06:13.996971 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.996933 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t8nw4\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-kube-api-access-t8nw4\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.996971 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.996964 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c6643ce-819b-493d-a650-95da3e927c4e-ca-trust-extracted\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.996971 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.996973 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-registry-certificates\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.997199 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.996982 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-installation-pull-secrets\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.997199 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.996992 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c6643ce-819b-493d-a650-95da3e927c4e-trusted-ca\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.997199 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.997000 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-bound-sa-token\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.997199 
ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.997009 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c6643ce-819b-493d-a650-95da3e927c4e-image-registry-private-configuration\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.997199 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:13.997018 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c6643ce-819b-493d-a650-95da3e927c4e-registry-tls\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\"" Apr 16 16:06:14.186926 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:14.186897 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-665647ff5f-shgzl"] Apr 16 16:06:14.195553 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:14.195523 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-665647ff5f-shgzl"] Apr 16 16:06:15.203506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:15.203473 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6643ce-819b-493d-a650-95da3e927c4e" path="/var/lib/kubelet/pods/9c6643ce-819b-493d-a650-95da3e927c4e/volumes" Apr 16 16:06:31.917880 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:31.917844 2566 generic.go:358] "Generic (PLEG): container finished" podID="d99110ea-ed08-4c60-a0df-4d2bbde55794" containerID="769f560637c0e369f06c6a71973411952c9503fa3fcb5be0f864e2f00fbac8d6" exitCode=0 Apr 16 16:06:31.918248 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:31.917895 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7" event={"ID":"d99110ea-ed08-4c60-a0df-4d2bbde55794","Type":"ContainerDied","Data":"769f560637c0e369f06c6a71973411952c9503fa3fcb5be0f864e2f00fbac8d6"} Apr 16 16:06:31.918248 ip-10-0-136-220 
kubenswrapper[2566]: I0416 16:06:31.918195 2566 scope.go:117] "RemoveContainer" containerID="769f560637c0e369f06c6a71973411952c9503fa3fcb5be0f864e2f00fbac8d6" Apr 16 16:06:32.922716 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:06:32.922683 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-j6lc7" event={"ID":"d99110ea-ed08-4c60-a0df-4d2bbde55794","Type":"ContainerStarted","Data":"c1f1326b2d3a2d9c0c0f9801d7dd769e22638e12171ae25490a0ba10396eeda9"} Apr 16 16:07:08.172104 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:08.172060 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:08.187905 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:08.187873 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:09.032213 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:09.032187 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:11.987359 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:11.987268 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:07:11.989674 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:11.989649 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e5c15b5-f8c7-478b-a327-14aad8952c3f-metrics-certs\") pod \"network-metrics-daemon-4kkrg\" (UID: \"3e5c15b5-f8c7-478b-a327-14aad8952c3f\") " pod="openshift-multus/network-metrics-daemon-4kkrg" Apr 16 16:07:12.202649 ip-10-0-136-220 kubenswrapper[2566]: 
I0416 16:07:12.202605 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vp75c\""
Apr 16 16:07:12.210123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:12.210103 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4kkrg"
Apr 16 16:07:12.324992 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:12.324961 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4kkrg"]
Apr 16 16:07:12.328568 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:07:12.328537 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e5c15b5_f8c7_478b_a327_14aad8952c3f.slice/crio-f1b16638e838db88aae23d2e601e143fa06a0a636b3b629a9b551ff20b68cc2c WatchSource:0}: Error finding container f1b16638e838db88aae23d2e601e143fa06a0a636b3b629a9b551ff20b68cc2c: Status 404 returned error can't find the container with id f1b16638e838db88aae23d2e601e143fa06a0a636b3b629a9b551ff20b68cc2c
Apr 16 16:07:13.032041 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:13.031998 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4kkrg" event={"ID":"3e5c15b5-f8c7-478b-a327-14aad8952c3f","Type":"ContainerStarted","Data":"f1b16638e838db88aae23d2e601e143fa06a0a636b3b629a9b551ff20b68cc2c"}
Apr 16 16:07:14.035704 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:14.035668 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4kkrg" event={"ID":"3e5c15b5-f8c7-478b-a327-14aad8952c3f","Type":"ContainerStarted","Data":"99db2cc80195321592176c2437d1562187334b995f2dfa0fad934fa772e1e592"}
Apr 16 16:07:14.035704 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:14.035704 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4kkrg" event={"ID":"3e5c15b5-f8c7-478b-a327-14aad8952c3f","Type":"ContainerStarted","Data":"ba0144be6acd52aaaf8c67a670cdd2b728237ef4226b3114c0e51ee5bcd2537d"}
Apr 16 16:07:14.054369 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:14.054319 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4kkrg" podStartSLOduration=252.038799632 podStartE2EDuration="4m13.054301501s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:07:12.330751628 +0000 UTC m=+251.771280063" lastFinishedPulling="2026-04-16 16:07:13.346253493 +0000 UTC m=+252.786781932" observedRunningTime="2026-04-16 16:07:14.053691087 +0000 UTC m=+253.494219542" watchObservedRunningTime="2026-04-16 16:07:14.054301501 +0000 UTC m=+253.494829955"
Apr 16 16:07:26.212156 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212121 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:07:26.212547 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212524 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="prometheus" containerID="cri-o://72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424" gracePeriod=600
Apr 16 16:07:26.212639 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212556 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy" containerID="cri-o://b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858" gracePeriod=600
Apr 16 16:07:26.212639 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212575 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="thanos-sidecar" containerID="cri-o://da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12" gracePeriod=600
Apr 16 16:07:26.212743 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212596 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="config-reloader" containerID="cri-o://940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3" gracePeriod=600
Apr 16 16:07:26.212743 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212642 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8" gracePeriod=600
Apr 16 16:07:26.212843 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:26.212645 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-web" containerID="cri-o://ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754" gracePeriod=600
Apr 16 16:07:27.084468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084431 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8" exitCode=0
Apr 16 16:07:27.084468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084456 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858" exitCode=0
Apr 16 16:07:27.084468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084462 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12" exitCode=0
Apr 16 16:07:27.084468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084470 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3" exitCode=0
Apr 16 16:07:27.084468 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084475 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424" exitCode=0
Apr 16 16:07:27.084803 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084502 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8"}
Apr 16 16:07:27.084803 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084540 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858"}
Apr 16 16:07:27.084803 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12"}
Apr 16 16:07:27.084803 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3"}
Apr 16 16:07:27.084803 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.084577 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424"}
Apr 16 16:07:27.458250 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.458225 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:07:27.613504 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613464 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613694 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613521 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-metrics-client-certs\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613694 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613546 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-thanos-prometheus-http-client-file\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613694 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613570 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-metrics-client-ca\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613877 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613719 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-web-config\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613877 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613784 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-trusted-ca-bundle\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613877 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613815 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.613877 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.613852 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-tls\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614177 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614105 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:27.614177 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614163 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-kubelet-serving-ca-bundle\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614198 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-tls-assets\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614221 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-db\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614243 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-config\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614248 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614267 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdtv8\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-kube-api-access-tdtv8\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614293 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-serving-certs-ca-bundle\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614346 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614329 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-grpc-tls\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614712 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614365 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-config-out\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614712 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614393 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-rulefiles-0\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614712 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614418 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-kube-rbac-proxy\") pod \"14919623-8ba1-4709-a344-79cb68341e3c\" (UID: \"14919623-8ba1-4709-a344-79cb68341e3c\") "
Apr 16 16:07:27.614712 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614683 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.614712 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.614703 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-metrics-client-ca\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.616845 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.616691 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.616974 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.616950 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.617040 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.616996 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.617040 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.617012 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.617158 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.617036 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.617158 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.617054 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:27.617429 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.617402 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:07:27.617565 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.617404 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:27.618296 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.618230 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:07:27.618932 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.618896 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.619052 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.619002 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:27.619153 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.619133 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-config" (OuterVolumeSpecName: "config") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.619301 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.619277 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.619434 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.619416 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-kube-api-access-tdtv8" (OuterVolumeSpecName: "kube-api-access-tdtv8") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "kube-api-access-tdtv8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:07:27.619643 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.619606 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-config-out" (OuterVolumeSpecName: "config-out") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:07:27.627319 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.627269 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-web-config" (OuterVolumeSpecName: "web-config") pod "14919623-8ba1-4709-a344-79cb68341e3c" (UID: "14919623-8ba1-4709-a344-79cb68341e3c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:27.715698 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715651 2566 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-web-config\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715698 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715696 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715698 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715708 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715698 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715718 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715698 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715728 2566 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-tls-assets\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715738 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-db\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715747 2566 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-config\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715755 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdtv8\" (UniqueName: \"kubernetes.io/projected/14919623-8ba1-4709-a344-79cb68341e3c-kube-api-access-tdtv8\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715765 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715775 2566 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-grpc-tls\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715784 2566 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14919623-8ba1-4709-a344-79cb68341e3c-config-out\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715794 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14919623-8ba1-4709-a344-79cb68341e3c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715802 2566 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-kube-rbac-proxy\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715811 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715835 2566 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-secret-metrics-client-certs\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:27.715994 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:27.715844 2566 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14919623-8ba1-4709-a344-79cb68341e3c-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-220.ec2.internal\" DevicePath \"\""
Apr 16 16:07:28.092473 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.092437 2566 generic.go:358] "Generic (PLEG): container finished" podID="14919623-8ba1-4709-a344-79cb68341e3c" containerID="ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754" exitCode=0
Apr 16 16:07:28.092690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.092506 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754"}
Apr 16 16:07:28.092690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.092544 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14919623-8ba1-4709-a344-79cb68341e3c","Type":"ContainerDied","Data":"eae3ad24ab73e36889780c2042dd6ce6d384879b06adfa5f9162767185594102"}
Apr 16 16:07:28.092690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.092564 2566 scope.go:117] "RemoveContainer" containerID="fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8"
Apr 16 16:07:28.092690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.092586 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:07:28.099992 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.099972 2566 scope.go:117] "RemoveContainer" containerID="b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858"
Apr 16 16:07:28.107388 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.107370 2566 scope.go:117] "RemoveContainer" containerID="ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754"
Apr 16 16:07:28.113239 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.113221 2566 scope.go:117] "RemoveContainer" containerID="da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12"
Apr 16 16:07:28.117109 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.117088 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:07:28.120095 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.120073 2566 scope.go:117] "RemoveContainer" containerID="940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3"
Apr 16 16:07:28.125054 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.125030 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:07:28.126868 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.126851 2566 scope.go:117] "RemoveContainer" containerID="72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424"
Apr 16 16:07:28.133223 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.133207 2566 scope.go:117] "RemoveContainer" containerID="9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70"
Apr 16 16:07:28.139371 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.139351 2566 scope.go:117] "RemoveContainer" containerID="fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8"
Apr 16 16:07:28.139647 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.139605 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8\": container with ID starting with fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8 not found: ID does not exist" containerID="fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8"
Apr 16 16:07:28.139714 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.139657 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8"} err="failed to get container status \"fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8\": rpc error: code = NotFound desc = could not find container \"fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8\": container with ID starting with fd56437d645f8a47509b30af7ed19af8888cb0cd766cb72d4fedad0c7015a8f8 not found: ID does not exist"
Apr 16 16:07:28.139714 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.139676 2566 scope.go:117] "RemoveContainer" containerID="b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858"
Apr 16 16:07:28.139934 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.139917 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858\": container with ID starting with b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858 not found: ID does not exist" containerID="b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858"
Apr 16 16:07:28.139996 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.139943 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858"} err="failed to get container status \"b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858\": rpc error: code = NotFound desc = could not find container \"b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858\": container with ID starting with b2db94d1059c9d59f50d3461e87004ebc1a2d6b3a65cc613c0c94972391a0858 not found: ID does not exist"
Apr 16 16:07:28.139996 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.139977 2566 scope.go:117] "RemoveContainer" containerID="ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754"
Apr 16 16:07:28.140210 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.140191 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754\": container with ID starting with ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754 not found: ID does not exist" containerID="ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754"
Apr 16 16:07:28.140321 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140215 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754"} err="failed to get container status \"ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754\": rpc error: code = NotFound desc = could not find container \"ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754\": container with ID starting with ed00857d2dec197423e0d8694e3dd7e556aedc9419dd554557582a14ea13a754 not found: ID does not exist"
Apr 16 16:07:28.140321 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140233 2566 scope.go:117] "RemoveContainer" containerID="da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12"
Apr 16 16:07:28.140438 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.140423 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12\": container with ID starting with da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12 not found: ID does not exist" containerID="da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12"
Apr 16 16:07:28.140477 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140441 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12"} err="failed to get container status \"da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12\": rpc error: code = NotFound desc = could not find container \"da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12\": container with ID starting with da481ad1f3020168c6e735ad57a36caec1f599d3f810f84bb5bd29dfbccc2a12 not found: ID does not exist"
Apr 16 16:07:28.140477 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140454 2566 scope.go:117] "RemoveContainer" containerID="940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3"
Apr 16 16:07:28.140652 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.140634 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3\": container with ID starting with
940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3 not found: ID does not exist" containerID="940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3" Apr 16 16:07:28.140690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140656 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3"} err="failed to get container status \"940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3\": rpc error: code = NotFound desc = could not find container \"940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3\": container with ID starting with 940d8d5291375b4ed95efc73e32825ccf47861730f57127235a527d489618cd3 not found: ID does not exist" Apr 16 16:07:28.140690 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140669 2566 scope.go:117] "RemoveContainer" containerID="72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424" Apr 16 16:07:28.140849 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.140836 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424\": container with ID starting with 72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424 not found: ID does not exist" containerID="72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424" Apr 16 16:07:28.140888 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140851 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424"} err="failed to get container status \"72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424\": rpc error: code = NotFound desc = could not find container \"72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424\": container with ID starting with 
72da6032c5de4e2bcc5cb454c8b52ad74c734cc6ea413070ed14f67a5b066424 not found: ID does not exist" Apr 16 16:07:28.140888 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.140863 2566 scope.go:117] "RemoveContainer" containerID="9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70" Apr 16 16:07:28.141071 ip-10-0-136-220 kubenswrapper[2566]: E0416 16:07:28.141056 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70\": container with ID starting with 9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70 not found: ID does not exist" containerID="9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70" Apr 16 16:07:28.141111 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.141072 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70"} err="failed to get container status \"9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70\": rpc error: code = NotFound desc = could not find container \"9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70\": container with ID starting with 9642472d0e5a693cdb31f5b5aa8aea21b75acd937473b815147e245197a99f70 not found: ID does not exist" Apr 16 16:07:28.153989 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.153958 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:07:28.154336 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154320 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy" Apr 16 16:07:28.154384 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154340 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" 
containerName="kube-rbac-proxy" Apr 16 16:07:28.154384 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154354 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-web" Apr 16 16:07:28.154384 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154363 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-web" Apr 16 16:07:28.154384 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154375 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c6643ce-819b-493d-a650-95da3e927c4e" containerName="registry" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154386 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6643ce-819b-493d-a650-95da3e927c4e" containerName="registry" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154397 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="prometheus" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154405 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="prometheus" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154413 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="thanos-sidecar" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154422 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="thanos-sidecar" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154433 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-thanos" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154441 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-thanos" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154458 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="init-config-reloader" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154466 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="init-config-reloader" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154475 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="config-reloader" Apr 16 16:07:28.154526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154483 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="config-reloader" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154535 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-web" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154543 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="prometheus" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154550 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c6643ce-819b-493d-a650-95da3e927c4e" containerName="registry" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154556 2566 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="thanos-sidecar" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154564 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="config-reloader" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154569 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy-thanos" Apr 16 16:07:28.154852 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.154576 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="14919623-8ba1-4709-a344-79cb68341e3c" containerName="kube-rbac-proxy" Apr 16 16:07:28.158900 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.158885 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.162199 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.162177 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:07:28.162199 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.162191 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 16:07:28.162337 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.162177 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:07:28.163790 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.163768 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:07:28.164608 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.164578 2566 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:07:28.164713 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.164638 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:07:28.164713 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.164657 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:07:28.164834 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.164657 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:07:28.164834 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.164755 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:07:28.165905 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.165888 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8fiqqigdojfrb\"" Apr 16 16:07:28.166115 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.166098 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:07:28.166429 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.166411 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:07:28.166522 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.166461 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-xg5bf\"" Apr 16 16:07:28.170162 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.170095 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:07:28.175238 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.174684 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:07:28.175860 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.175822 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:07:28.320938 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.320897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c64e033c-09ac-45bc-acb4-9059ab2582a8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.320938 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.320933 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c64e033c-09ac-45bc-acb4-9059ab2582a8-config-out\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321195 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.320956 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321195 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321008 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321195 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321060 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321195 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321080 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321195 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-config\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 
kubenswrapper[2566]: I0416 16:07:28.321229 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321262 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321353 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxrw\" (UniqueName: \"kubernetes.io/projected/c64e033c-09ac-45bc-acb4-9059ab2582a8-kube-api-access-6xxrw\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321371 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321387 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321385 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321665 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321401 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321665 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321420 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321665 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321448 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.321665 
ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.321499 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-web-config\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422254 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422204 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422254 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422261 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxrw\" (UniqueName: \"kubernetes.io/projected/c64e033c-09ac-45bc-acb4-9059ab2582a8-kube-api-access-6xxrw\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422467 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422283 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422467 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422467 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422319 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422467 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422704 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422680 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422777 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422694 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422777 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422748 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422875 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422782 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c64e033c-09ac-45bc-acb4-9059ab2582a8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422875 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422810 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c64e033c-09ac-45bc-acb4-9059ab2582a8-config-out\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422875 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422836 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.422875 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422870 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422941 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.422971 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.423004 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-config\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.423034 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423123 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.423058 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423410 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.423197 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.423948 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.423925 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.425526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.425499 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.425526 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.425513 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
16:07:28.425707 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.425606 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.425961 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.425935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.426025 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.425958 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.426502 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.426196 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-web-config\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.426502 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.426234 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.426729 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.426695 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.427229 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.427201 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-config\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.427711 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.427691 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c64e033c-09ac-45bc-acb4-9059ab2582a8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.428094 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.428075 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c64e033c-09ac-45bc-acb4-9059ab2582a8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.428158 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.428090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c64e033c-09ac-45bc-acb4-9059ab2582a8-config-out\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.428158 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.428128 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.428517 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.428499 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c64e033c-09ac-45bc-acb4-9059ab2582a8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.431558 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.431541 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxrw\" (UniqueName: \"kubernetes.io/projected/c64e033c-09ac-45bc-acb4-9059ab2582a8-kube-api-access-6xxrw\") pod \"prometheus-k8s-0\" (UID: \"c64e033c-09ac-45bc-acb4-9059ab2582a8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.467685 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.467650 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:07:28.598685 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:28.598659 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:07:28.601308 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:07:28.601277 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc64e033c_09ac_45bc_acb4_9059ab2582a8.slice/crio-4309c14ec1ee8fc1448aea4a21c90a8a41fe777faf432542165e83128e6c003b WatchSource:0}: Error finding container 4309c14ec1ee8fc1448aea4a21c90a8a41fe777faf432542165e83128e6c003b: Status 404 returned error can't find the container with id 4309c14ec1ee8fc1448aea4a21c90a8a41fe777faf432542165e83128e6c003b Apr 16 16:07:29.097557 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:29.097461 2566 generic.go:358] "Generic (PLEG): container finished" podID="c64e033c-09ac-45bc-acb4-9059ab2582a8" containerID="2e129e2929ee264c5e7f16ef0047666e61759133541f0f32adcb21192df80bd4" exitCode=0 Apr 16 16:07:29.097557 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:29.097532 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerDied","Data":"2e129e2929ee264c5e7f16ef0047666e61759133541f0f32adcb21192df80bd4"} Apr 16 16:07:29.097557 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:29.097552 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"4309c14ec1ee8fc1448aea4a21c90a8a41fe777faf432542165e83128e6c003b"} Apr 16 16:07:29.206944 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:29.206915 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14919623-8ba1-4709-a344-79cb68341e3c" 
path="/var/lib/kubelet/pods/14919623-8ba1-4709-a344-79cb68341e3c/volumes" Apr 16 16:07:30.103556 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.103525 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"d0b3e2e69960cec8bd5535bd9b0948bb6ef71e89798f354759dfef8286ad3a3a"} Apr 16 16:07:30.103556 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.103559 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"8e0182029746cb4c657448eb02e74fb0c7d1301712253e0d3d8e83b4903931a3"} Apr 16 16:07:30.103986 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.103569 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"92c9fd558c25d119769c008132db6456718bcf55cae94fa18c24a962284435d8"} Apr 16 16:07:30.103986 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.103578 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"c719e021a68a74428f69107900d852080c512c3e50b0f849fb6befda7cb3f026"} Apr 16 16:07:30.103986 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.103585 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"8d3a2b2b843d5a5792bcd06478622cf1bac1847935eb461ab758f2e99d8f7dd5"} Apr 16 16:07:30.103986 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.103593 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c64e033c-09ac-45bc-acb4-9059ab2582a8","Type":"ContainerStarted","Data":"777bf5cf899fca95b587ab1f8e51f3ef9d3db6cc6d6c513fe8d0fef1e21df2fb"} Apr 16 16:07:30.134820 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:30.134764 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.134745575 podStartE2EDuration="2.134745575s" podCreationTimestamp="2026-04-16 16:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:07:30.133147624 +0000 UTC m=+269.573676103" watchObservedRunningTime="2026-04-16 16:07:30.134745575 +0000 UTC m=+269.575274032" Apr 16 16:07:33.468224 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:07:33.468183 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:08:01.108939 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:08:01.108909 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:08:01.111489 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:08:01.111458 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:08:01.114093 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:08:01.114065 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:08:28.468282 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:08:28.468245 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:08:28.483517 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:08:28.483493 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
16:08:29.281540 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:08:29.281512 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:09:34.225422 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.225387 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s4gzd"] Apr 16 16:09:34.227539 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.227520 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.230243 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.230223 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:09:34.237650 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.237608 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s4gzd"] Apr 16 16:09:34.323063 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.323025 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d375da20-55b8-475b-ac7b-781fd8109e28-kubelet-config\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.323283 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.323155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d375da20-55b8-475b-ac7b-781fd8109e28-dbus\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.323371 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.323308 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d375da20-55b8-475b-ac7b-781fd8109e28-original-pull-secret\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.424129 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.424100 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d375da20-55b8-475b-ac7b-781fd8109e28-kubelet-config\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.424281 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.424145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d375da20-55b8-475b-ac7b-781fd8109e28-dbus\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.424281 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.424199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d375da20-55b8-475b-ac7b-781fd8109e28-original-pull-secret\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.424281 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.424225 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d375da20-55b8-475b-ac7b-781fd8109e28-kubelet-config\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.424393 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.424374 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d375da20-55b8-475b-ac7b-781fd8109e28-dbus\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.426494 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.426465 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d375da20-55b8-475b-ac7b-781fd8109e28-original-pull-secret\") pod \"global-pull-secret-syncer-s4gzd\" (UID: \"d375da20-55b8-475b-ac7b-781fd8109e28\") " pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.536769 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.536675 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4gzd" Apr 16 16:09:34.654666 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.654643 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s4gzd"] Apr 16 16:09:34.658705 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:09:34.658672 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd375da20_55b8_475b_ac7b_781fd8109e28.slice/crio-a7ad1c410d2611e708ffb40b1061007afcc2adfba934cbe2e30d6db32535afd4 WatchSource:0}: Error finding container a7ad1c410d2611e708ffb40b1061007afcc2adfba934cbe2e30d6db32535afd4: Status 404 returned error can't find the container with id a7ad1c410d2611e708ffb40b1061007afcc2adfba934cbe2e30d6db32535afd4 Apr 16 16:09:34.660235 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:34.660215 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:09:35.448915 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:35.448876 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-s4gzd" event={"ID":"d375da20-55b8-475b-ac7b-781fd8109e28","Type":"ContainerStarted","Data":"a7ad1c410d2611e708ffb40b1061007afcc2adfba934cbe2e30d6db32535afd4"} Apr 16 16:09:39.460432 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:39.460390 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s4gzd" event={"ID":"d375da20-55b8-475b-ac7b-781fd8109e28","Type":"ContainerStarted","Data":"d3c0a2260976ee24cc936818f5d31b3fedafb49f4e4503fdbba00c41bf16d4b3"} Apr 16 16:09:39.501172 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:09:39.501117 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s4gzd" podStartSLOduration=1.339210647 podStartE2EDuration="5.501101603s" podCreationTimestamp="2026-04-16 16:09:34 +0000 UTC" firstStartedPulling="2026-04-16 16:09:34.660335064 +0000 UTC m=+394.100863498" lastFinishedPulling="2026-04-16 16:09:38.82222601 +0000 UTC m=+398.262754454" observedRunningTime="2026-04-16 16:09:39.497967987 +0000 UTC m=+398.938496441" watchObservedRunningTime="2026-04-16 16:09:39.501101603 +0000 UTC m=+398.941630058" Apr 16 16:13:01.129982 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:13:01.129956 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:13:01.130470 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:13:01.130136 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:18:01.149100 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:18:01.149032 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:18:01.149846 ip-10-0-136-220 kubenswrapper[2566]: 
I0416 16:18:01.149825 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:23:01.168995 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:23:01.168967 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:23:01.169554 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:23:01.169003 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log" Apr 16 16:27:16.988440 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:16.988403 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4jnll/must-gather-shpth"] Apr 16 16:27:16.991587 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:16.991571 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:16.994089 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:16.994070 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4jnll\"/\"kube-root-ca.crt\"" Apr 16 16:27:16.994213 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:16.994195 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4jnll\"/\"default-dockercfg-bq59w\"" Apr 16 16:27:16.994285 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:16.994270 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4jnll\"/\"openshift-service-ca.crt\"" Apr 16 16:27:16.997867 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:16.997843 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/must-gather-shpth"] Apr 16 16:27:17.014301 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.014272 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f445d32c-fc16-4c3f-8ab3-1ee328db7a4e-must-gather-output\") pod \"must-gather-shpth\" (UID: \"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e\") " pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.014421 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.014307 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnhl\" (UniqueName: \"kubernetes.io/projected/f445d32c-fc16-4c3f-8ab3-1ee328db7a4e-kube-api-access-rnnhl\") pod \"must-gather-shpth\" (UID: \"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e\") " pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.115351 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.115308 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f445d32c-fc16-4c3f-8ab3-1ee328db7a4e-must-gather-output\") pod \"must-gather-shpth\" (UID: \"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e\") " pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.115351 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.115347 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnhl\" (UniqueName: \"kubernetes.io/projected/f445d32c-fc16-4c3f-8ab3-1ee328db7a4e-kube-api-access-rnnhl\") pod \"must-gather-shpth\" (UID: \"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e\") " pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.115765 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.115744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f445d32c-fc16-4c3f-8ab3-1ee328db7a4e-must-gather-output\") pod \"must-gather-shpth\" (UID: \"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e\") " pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.128306 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.128276 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnhl\" (UniqueName: \"kubernetes.io/projected/f445d32c-fc16-4c3f-8ab3-1ee328db7a4e-kube-api-access-rnnhl\") pod \"must-gather-shpth\" (UID: \"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e\") " pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.301469 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.301383 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4jnll/must-gather-shpth" Apr 16 16:27:17.421604 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.421581 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/must-gather-shpth"] Apr 16 16:27:17.423601 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:27:17.423575 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf445d32c_fc16_4c3f_8ab3_1ee328db7a4e.slice/crio-71b9c0f630ee29ad8b499e60ff539523da6b6cc4233ec6db556bccacca2723f8 WatchSource:0}: Error finding container 71b9c0f630ee29ad8b499e60ff539523da6b6cc4233ec6db556bccacca2723f8: Status 404 returned error can't find the container with id 71b9c0f630ee29ad8b499e60ff539523da6b6cc4233ec6db556bccacca2723f8 Apr 16 16:27:17.425405 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:17.425390 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:27:18.349899 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:18.349866 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/must-gather-shpth" event={"ID":"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e","Type":"ContainerStarted","Data":"71b9c0f630ee29ad8b499e60ff539523da6b6cc4233ec6db556bccacca2723f8"} Apr 16 16:27:19.356934 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:19.356893 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/must-gather-shpth" event={"ID":"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e","Type":"ContainerStarted","Data":"4f7c500ed0dcc7b25f46952d28eb95babce953441971716822708840ac34d7fd"} Apr 16 16:27:19.357400 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:19.356946 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/must-gather-shpth" 
event={"ID":"f445d32c-fc16-4c3f-8ab3-1ee328db7a4e","Type":"ContainerStarted","Data":"9f4a7053f75e8e2d8dceb574fd5df900d742f7f919b2bc1db45f623101a33fc0"} Apr 16 16:27:19.372037 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:19.371839 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4jnll/must-gather-shpth" podStartSLOduration=2.506556812 podStartE2EDuration="3.371822443s" podCreationTimestamp="2026-04-16 16:27:16 +0000 UTC" firstStartedPulling="2026-04-16 16:27:17.425537648 +0000 UTC m=+1456.866066080" lastFinishedPulling="2026-04-16 16:27:18.290803276 +0000 UTC m=+1457.731331711" observedRunningTime="2026-04-16 16:27:19.370872477 +0000 UTC m=+1458.811400931" watchObservedRunningTime="2026-04-16 16:27:19.371822443 +0000 UTC m=+1458.812350899" Apr 16 16:27:19.744248 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:19.744219 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-s4gzd_d375da20-55b8-475b-ac7b-781fd8109e28/global-pull-secret-syncer/0.log" Apr 16 16:27:19.794498 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:19.794469 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9wz49_48f3d044-df7a-4aec-b463-07afb7514443/konnectivity-agent/0.log" Apr 16 16:27:19.910326 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:19.910297 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-220.ec2.internal_ba662560bc4f4ac2db429e0e7b17ae15/haproxy/0.log" Apr 16 16:27:23.775965 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:23.775935 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nchbt_1a908bba-4a3e-4450-9923-8041ecce747a/node-exporter/0.log" Apr 16 16:27:23.800083 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:23.800055 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-nchbt_1a908bba-4a3e-4450-9923-8041ecce747a/kube-rbac-proxy/0.log" Apr 16 16:27:23.821681 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:23.821650 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nchbt_1a908bba-4a3e-4450-9923-8041ecce747a/init-textfile/0.log" Apr 16 16:27:23.929995 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:23.929968 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/prometheus/0.log" Apr 16 16:27:23.957379 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:23.957336 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/config-reloader/0.log" Apr 16 16:27:23.985045 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:23.984973 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/thanos-sidecar/0.log" Apr 16 16:27:24.006551 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.006527 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/kube-rbac-proxy-web/0.log" Apr 16 16:27:24.030129 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.030049 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/kube-rbac-proxy/0.log" Apr 16 16:27:24.055652 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.055593 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/kube-rbac-proxy-thanos/0.log" Apr 16 16:27:24.079781 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.079748 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c64e033c-09ac-45bc-acb4-9059ab2582a8/init-config-reloader/0.log" Apr 16 16:27:24.272249 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.272224 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f56b7dbf8-hh5gc_0dbc1cfe-574b-4558-b9a0-9dff690c8dc2/thanos-query/0.log" Apr 16 16:27:24.293237 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.293159 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f56b7dbf8-hh5gc_0dbc1cfe-574b-4558-b9a0-9dff690c8dc2/kube-rbac-proxy-web/0.log" Apr 16 16:27:24.316417 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.316393 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f56b7dbf8-hh5gc_0dbc1cfe-574b-4558-b9a0-9dff690c8dc2/kube-rbac-proxy/0.log" Apr 16 16:27:24.336085 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.336060 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f56b7dbf8-hh5gc_0dbc1cfe-574b-4558-b9a0-9dff690c8dc2/prom-label-proxy/0.log" Apr 16 16:27:24.360416 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.360383 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f56b7dbf8-hh5gc_0dbc1cfe-574b-4558-b9a0-9dff690c8dc2/kube-rbac-proxy-rules/0.log" Apr 16 16:27:24.388733 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:24.388702 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f56b7dbf8-hh5gc_0dbc1cfe-574b-4558-b9a0-9dff690c8dc2/kube-rbac-proxy-metrics/0.log" Apr 16 16:27:25.558812 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:25.558774 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-gnvkb_745be784-393b-4faa-b213-9d593611899a/networking-console-plugin/0.log" Apr 16 16:27:26.988544 
ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:26.988507 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"]
Apr 16 16:27:26.993292 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:26.993252 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.001787 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.001760 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"]
Apr 16 16:27:27.110400 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.110366 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-lib-modules\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.110916 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.110655 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-proc\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.110916 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.110717 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p8lm\" (UniqueName: \"kubernetes.io/projected/61994914-186c-40ac-8b91-13c82cc63165-kube-api-access-6p8lm\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.110916 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.110771 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-sys\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.110916 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.110816 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-podres\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212318 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212277 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-proc\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p8lm\" (UniqueName: \"kubernetes.io/projected/61994914-186c-40ac-8b91-13c82cc63165-kube-api-access-6p8lm\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-proc\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212405 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-sys\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212506 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212460 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-podres\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212766 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212512 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-lib-modules\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212766 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212700 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-lib-modules\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212766 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212727 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-sys\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.212928 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.212798 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/61994914-186c-40ac-8b91-13c82cc63165-podres\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.221216 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.221189 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p8lm\" (UniqueName: \"kubernetes.io/projected/61994914-186c-40ac-8b91-13c82cc63165-kube-api-access-6p8lm\") pod \"perf-node-gather-daemonset-h7wd9\" (UID: \"61994914-186c-40ac-8b91-13c82cc63165\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.304511 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.304434 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:27.447640 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.447589 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"]
Apr 16 16:27:27.449544 ip-10-0-136-220 kubenswrapper[2566]: W0416 16:27:27.449515 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod61994914_186c_40ac_8b91_13c82cc63165.slice/crio-d89ae6e27bb46e61d5263a04d4e8ca8b4b3f36e7d1e0aa436bc98fc1f12904d0 WatchSource:0}: Error finding container d89ae6e27bb46e61d5263a04d4e8ca8b4b3f36e7d1e0aa436bc98fc1f12904d0: Status 404 returned error can't find the container with id d89ae6e27bb46e61d5263a04d4e8ca8b4b3f36e7d1e0aa436bc98fc1f12904d0
Apr 16 16:27:27.564967 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.564897 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fm66n_045826c8-ee95-494f-8d06-d8d18b2717ca/dns/0.log"
Apr 16 16:27:27.584911 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.584886 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fm66n_045826c8-ee95-494f-8d06-d8d18b2717ca/kube-rbac-proxy/0.log"
Apr 16 16:27:27.675878 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:27.675849 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjkzm_b54e0816-7a3e-49eb-bc50-eebcbb3a03c2/dns-node-resolver/0.log"
Apr 16 16:27:28.088165 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.088133 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-77c9bfcb4d-8wlph_9ea9ebe0-ee87-4861-b4c4-b7866eef641d/registry/0.log"
Apr 16 16:27:28.106984 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.106960 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kl4jm_19e9bbf4-2f93-4060-aa4f-2838412f8254/node-ca/0.log"
Apr 16 16:27:28.390856 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.390822 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9" event={"ID":"61994914-186c-40ac-8b91-13c82cc63165","Type":"ContainerStarted","Data":"3b513b8b9e6ad514028af35ff8e0b10d52a6dc640ce9e036cf6fab9a0712dcc8"}
Apr 16 16:27:28.390856 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.390862 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9" event={"ID":"61994914-186c-40ac-8b91-13c82cc63165","Type":"ContainerStarted","Data":"d89ae6e27bb46e61d5263a04d4e8ca8b4b3f36e7d1e0aa436bc98fc1f12904d0"}
Apr 16 16:27:28.391076 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.390940 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:28.408563 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.408496 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9" podStartSLOduration=2.408474866 podStartE2EDuration="2.408474866s" podCreationTimestamp="2026-04-16 16:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:27:28.406272259 +0000 UTC m=+1467.846800726" watchObservedRunningTime="2026-04-16 16:27:28.408474866 +0000 UTC m=+1467.849003322"
Apr 16 16:27:28.861207 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:28.861139 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-568f646654-h7nqt_403c812f-312a-4792-b1da-b54e362000a8/router/0.log"
Apr 16 16:27:29.240442 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:29.240413 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pxsb9_deb31f3a-b5fe-4b80-a8dd-2ef625183254/serve-healthcheck-canary/0.log"
Apr 16 16:27:29.773308 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:29.773278 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vdrp4_6cabfaf7-a334-4298-988b-7cb9b7e35302/kube-rbac-proxy/0.log"
Apr 16 16:27:29.793639 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:29.793594 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vdrp4_6cabfaf7-a334-4298-988b-7cb9b7e35302/exporter/0.log"
Apr 16 16:27:29.814059 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:29.814031 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vdrp4_6cabfaf7-a334-4298-988b-7cb9b7e35302/extractor/0.log"
Apr 16 16:27:34.404635 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:34.404592 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-h7wd9"
Apr 16 16:27:37.306661 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.306632 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:27:37.327991 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.327966 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/egress-router-binary-copy/0.log"
Apr 16 16:27:37.347892 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.347869 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/cni-plugins/0.log"
Apr 16 16:27:37.367793 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.367772 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/bond-cni-plugin/0.log"
Apr 16 16:27:37.389553 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.389528 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/routeoverride-cni/0.log"
Apr 16 16:27:37.410914 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.410884 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/whereabouts-cni-bincopy/0.log"
Apr 16 16:27:37.432937 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.432904 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pmzx9_eb39d18c-d897-41cd-b539-6c31f7f376e3/whereabouts-cni/0.log"
Apr 16 16:27:37.486158 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.486125 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vscz2_06e6e0e3-2b76-41df-bcbc-8f60a89cd7e7/kube-multus/0.log"
Apr 16 16:27:37.579962 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.579885 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4kkrg_3e5c15b5-f8c7-478b-a327-14aad8952c3f/network-metrics-daemon/0.log"
Apr 16 16:27:37.606811 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:37.606780 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4kkrg_3e5c15b5-f8c7-478b-a327-14aad8952c3f/kube-rbac-proxy/0.log"
Apr 16 16:27:38.430076 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.430045 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-controller/0.log"
Apr 16 16:27:38.448982 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.448959 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/0.log"
Apr 16 16:27:38.456056 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.456025 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovn-acl-logging/1.log"
Apr 16 16:27:38.476144 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.476120 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/kube-rbac-proxy-node/0.log"
Apr 16 16:27:38.496138 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.496111 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 16:27:38.513935 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.513910 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/northd/0.log"
Apr 16 16:27:38.535573 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.535546 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/nbdb/0.log"
Apr 16 16:27:38.555766 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.555666 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/sbdb/0.log"
Apr 16 16:27:38.660669 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:38.660627 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkhjf_06eb6e48-21ae-44ee-bf36-e4206b109746/ovnkube-controller/0.log"
Apr 16 16:27:40.090952 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:40.090928 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-59cwx_eedc515d-9a62-481f-9dbd-8fe29169a6b1/check-endpoints/0.log"
Apr 16 16:27:40.113439 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:40.113403 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gvxkh_99da9992-0d67-494e-853e-a94744056361/network-check-target-container/0.log"
Apr 16 16:27:41.064171 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:41.064143 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lhzgg_f10efae8-7168-40df-b502-62b0d2d36756/iptables-alerter/0.log"
Apr 16 16:27:41.742003 ip-10-0-136-220 kubenswrapper[2566]: I0416 16:27:41.741972 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-72bdw_2f64ce22-0fab-4753-8319-62ac8a354b24/tuned/0.log"